Feb 17 00:05:41 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 17 00:05:41 crc restorecon[4698]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 00:05:41 crc restorecon[4698]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc 
restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc 
restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 
00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc 
restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 00:05:41 crc 
restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:41
crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:41 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:41 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 00:05:42 crc restorecon[4698]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 
crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc 
restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
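These "not reset as customized by admin" messages are informational: restorecon skips files whose SELinux context has been locally customized (for example via `semanage fcontext` or `chcon`) instead of resetting them to the policy default. When the journal is flooded with them, a quick way to get an overview is to group the skipped entries by target context. The pipeline below is a sketch, not part of the log above; it assumes each entry fits on one line in the form `<path> not reset as customized by admin to <context>` (entries that a pager or terminal has hard-wrapped mid-line would need rejoining first), and the `kubelet.service` unit name is an assumption about this host:

```shell
#!/bin/sh
# Count restorecon "not reset" messages per SELinux target context.
# Feed it journal text, e.g.:
#   journalctl -u kubelet.service | sh summarize-not-reset.sh
grep 'not reset as customized by admin' \
  | sed 's/.*not reset as customized by admin to //' \
  | sort \
  | uniq -c \
  | sort -rn
```

For the excerpt above this would collapse thousands of per-file lines into a handful of counts, one per context such as `system_u:object_r:container_file_t:s0:c7,c13`, which makes it easy to see that the skipped files are pod volumes carrying MCS category pairs rather than genuinely mislabeled content.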
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc 
restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 00:05:42 crc restorecon[4698]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 00:05:42 crc restorecon[4698]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 17 00:05:42 crc kubenswrapper[4791]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 00:05:42 crc kubenswrapper[4791]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 17 00:05:42 crc kubenswrapper[4791]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 00:05:42 crc kubenswrapper[4791]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 17 00:05:42 crc kubenswrapper[4791]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 17 00:05:42 crc kubenswrapper[4791]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.957333 4791 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.970878 4791 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.970913 4791 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.970923 4791 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.970931 4791 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.970943 4791 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.970951 4791 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.970959 4791 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.970967 4791 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.970975 4791 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 
00:05:42.970984 4791 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.970996 4791 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971008 4791 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971018 4791 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971027 4791 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971036 4791 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971044 4791 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971052 4791 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971061 4791 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971069 4791 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971080 4791 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971089 4791 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971098 4791 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971132 4791 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971141 4791 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971150 4791 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971159 4791 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971167 4791 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971176 4791 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971191 4791 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971199 4791 feature_gate.go:330] unrecognized feature gate: Example Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971208 4791 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971216 4791 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971224 4791 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971232 4791 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 
00:05:42.971240 4791 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971248 4791 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971255 4791 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971263 4791 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971272 4791 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971280 4791 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971289 4791 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971297 4791 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971304 4791 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971312 4791 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971320 4791 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971328 4791 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971336 4791 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971344 4791 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971351 4791 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 00:05:42 
crc kubenswrapper[4791]: W0217 00:05:42.971362 4791 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971372 4791 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971381 4791 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971389 4791 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971396 4791 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971404 4791 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971413 4791 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971422 4791 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971430 4791 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971437 4791 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971445 4791 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971452 4791 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971461 4791 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971468 4791 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971476 4791 
feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971483 4791 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971491 4791 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971499 4791 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971507 4791 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971517 4791 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971527 4791 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.971537 4791 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972616 4791 flags.go:64] FLAG: --address="0.0.0.0" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972641 4791 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972657 4791 flags.go:64] FLAG: --anonymous-auth="true" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972668 4791 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972679 4791 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972690 4791 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972701 4791 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 
00:05:42.972713 4791 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972722 4791 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972731 4791 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972741 4791 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972752 4791 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972761 4791 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972771 4791 flags.go:64] FLAG: --cgroup-root="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972780 4791 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972789 4791 flags.go:64] FLAG: --client-ca-file="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972798 4791 flags.go:64] FLAG: --cloud-config="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972807 4791 flags.go:64] FLAG: --cloud-provider="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972816 4791 flags.go:64] FLAG: --cluster-dns="[]" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972826 4791 flags.go:64] FLAG: --cluster-domain="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972835 4791 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972845 4791 flags.go:64] FLAG: --config-dir="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972874 4791 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972885 4791 flags.go:64] FLAG: --container-log-max-files="5" Feb 17 00:05:42 crc 
kubenswrapper[4791]: I0217 00:05:42.972896 4791 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972905 4791 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972914 4791 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972924 4791 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972933 4791 flags.go:64] FLAG: --contention-profiling="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972942 4791 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972951 4791 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972960 4791 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972970 4791 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972980 4791 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972989 4791 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.972998 4791 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973012 4791 flags.go:64] FLAG: --enable-load-reader="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973021 4791 flags.go:64] FLAG: --enable-server="true" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973030 4791 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973043 4791 flags.go:64] FLAG: --event-burst="100" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973052 4791 flags.go:64] FLAG: 
--event-qps="50" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973062 4791 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973071 4791 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973081 4791 flags.go:64] FLAG: --eviction-hard="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973092 4791 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973102 4791 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973143 4791 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973154 4791 flags.go:64] FLAG: --eviction-soft="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973164 4791 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973173 4791 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973182 4791 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973192 4791 flags.go:64] FLAG: --experimental-mounter-path="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973201 4791 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973210 4791 flags.go:64] FLAG: --fail-swap-on="true" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973219 4791 flags.go:64] FLAG: --feature-gates="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973230 4791 flags.go:64] FLAG: --file-check-frequency="20s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973239 4791 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973248 4791 flags.go:64] 
FLAG: --hairpin-mode="promiscuous-bridge" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973258 4791 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973267 4791 flags.go:64] FLAG: --healthz-port="10248" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973276 4791 flags.go:64] FLAG: --help="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973285 4791 flags.go:64] FLAG: --hostname-override="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973293 4791 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973302 4791 flags.go:64] FLAG: --http-check-frequency="20s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973312 4791 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973321 4791 flags.go:64] FLAG: --image-credential-provider-config="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973329 4791 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973338 4791 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973350 4791 flags.go:64] FLAG: --image-service-endpoint="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973359 4791 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973380 4791 flags.go:64] FLAG: --kube-api-burst="100" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973389 4791 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973399 4791 flags.go:64] FLAG: --kube-api-qps="50" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973408 4791 flags.go:64] FLAG: --kube-reserved="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973417 4791 flags.go:64] 
FLAG: --kube-reserved-cgroup="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973426 4791 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973435 4791 flags.go:64] FLAG: --kubelet-cgroups="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973444 4791 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973453 4791 flags.go:64] FLAG: --lock-file="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973463 4791 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973471 4791 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973481 4791 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973502 4791 flags.go:64] FLAG: --log-json-split-stream="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973512 4791 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973522 4791 flags.go:64] FLAG: --log-text-split-stream="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973531 4791 flags.go:64] FLAG: --logging-format="text" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973539 4791 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973549 4791 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973558 4791 flags.go:64] FLAG: --manifest-url="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973567 4791 flags.go:64] FLAG: --manifest-url-header="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973578 4791 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973587 4791 
flags.go:64] FLAG: --max-open-files="1000000" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973598 4791 flags.go:64] FLAG: --max-pods="110" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973607 4791 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973616 4791 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973626 4791 flags.go:64] FLAG: --memory-manager-policy="None" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973634 4791 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973644 4791 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973652 4791 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973661 4791 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973684 4791 flags.go:64] FLAG: --node-status-max-images="50" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973693 4791 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973702 4791 flags.go:64] FLAG: --oom-score-adj="-999" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973711 4791 flags.go:64] FLAG: --pod-cidr="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973721 4791 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973734 4791 flags.go:64] FLAG: --pod-manifest-path="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973743 4791 flags.go:64] FLAG: --pod-max-pids="-1" Feb 17 
00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973752 4791 flags.go:64] FLAG: --pods-per-core="0" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973761 4791 flags.go:64] FLAG: --port="10250" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973770 4791 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973779 4791 flags.go:64] FLAG: --provider-id="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973789 4791 flags.go:64] FLAG: --qos-reserved="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973798 4791 flags.go:64] FLAG: --read-only-port="10255" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973807 4791 flags.go:64] FLAG: --register-node="true" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973816 4791 flags.go:64] FLAG: --register-schedulable="true" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973825 4791 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973839 4791 flags.go:64] FLAG: --registry-burst="10" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973848 4791 flags.go:64] FLAG: --registry-qps="5" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973857 4791 flags.go:64] FLAG: --reserved-cpus="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973866 4791 flags.go:64] FLAG: --reserved-memory="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973882 4791 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973891 4791 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973900 4791 flags.go:64] FLAG: --rotate-certificates="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973909 4791 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973918 
4791 flags.go:64] FLAG: --runonce="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973928 4791 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973938 4791 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973947 4791 flags.go:64] FLAG: --seccomp-default="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973956 4791 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973965 4791 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973974 4791 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973984 4791 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.973993 4791 flags.go:64] FLAG: --storage-driver-password="root" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974004 4791 flags.go:64] FLAG: --storage-driver-secure="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974013 4791 flags.go:64] FLAG: --storage-driver-table="stats" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974022 4791 flags.go:64] FLAG: --storage-driver-user="root" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974032 4791 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974041 4791 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974051 4791 flags.go:64] FLAG: --system-cgroups="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974060 4791 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974073 4791 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 17 
00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974083 4791 flags.go:64] FLAG: --tls-cert-file="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974091 4791 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974102 4791 flags.go:64] FLAG: --tls-min-version="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974135 4791 flags.go:64] FLAG: --tls-private-key-file="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974144 4791 flags.go:64] FLAG: --topology-manager-policy="none" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974153 4791 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974162 4791 flags.go:64] FLAG: --topology-manager-scope="container" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974172 4791 flags.go:64] FLAG: --v="2" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974183 4791 flags.go:64] FLAG: --version="false" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974194 4791 flags.go:64] FLAG: --vmodule="" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974204 4791 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.974217 4791 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974420 4791 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974432 4791 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974441 4791 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974450 4791 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974458 4791 
feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974466 4791 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974475 4791 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974483 4791 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974492 4791 feature_gate.go:330] unrecognized feature gate: Example Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974500 4791 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974507 4791 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974515 4791 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974525 4791 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974533 4791 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974544 4791 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974554 4791 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974563 4791 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974571 4791 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974579 4791 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974586 4791 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974594 4791 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974603 4791 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974610 4791 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974618 4791 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974626 4791 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974634 4791 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974642 4791 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974649 4791 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974657 4791 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 00:05:42 crc 
kubenswrapper[4791]: W0217 00:05:42.974665 4791 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974675 4791 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974688 4791 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974697 4791 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974706 4791 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974714 4791 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974722 4791 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974730 4791 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974737 4791 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974746 4791 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974754 4791 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974762 4791 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974770 4791 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974778 4791 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974785 
4791 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974796 4791 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974803 4791 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974812 4791 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974819 4791 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974827 4791 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974834 4791 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974842 4791 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974850 4791 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974861 4791 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974870 4791 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974878 4791 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974887 4791 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974896 4791 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974905 4791 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974913 4791 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974920 4791 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974928 4791 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974935 4791 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974943 4791 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974956 4791 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974965 4791 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974972 4791 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974980 4791 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974988 4791 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.974996 4791 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.975004 4791 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.975011 4791 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.975024 4791 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.987753 4791 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.987797 4791 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.987930 4791 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.987944 4791 feature_gate.go:330] unrecognized 
feature gate: VolumeGroupSnapshot Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.987953 4791 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.987962 4791 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.987971 4791 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.987979 4791 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.987988 4791 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.987996 4791 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988004 4791 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988012 4791 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988020 4791 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988029 4791 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988037 4791 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988044 4791 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988053 4791 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988062 4791 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 
00:05:42.988071 4791 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988080 4791 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988089 4791 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988099 4791 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988159 4791 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988180 4791 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988191 4791 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988199 4791 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988207 4791 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988215 4791 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988223 4791 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988231 4791 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988243 4791 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988256 4791 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988266 4791 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988275 4791 feature_gate.go:330] unrecognized feature gate: Example Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988284 4791 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988292 4791 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988302 4791 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988313 4791 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988322 4791 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988331 4791 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988340 4791 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988348 4791 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988356 4791 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988364 4791 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988372 4791 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988380 4791 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988388 4791 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988395 4791 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988403 4791 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988411 4791 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988419 4791 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988426 4791 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 
00:05:42.988434 4791 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988442 4791 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988452 4791 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988462 4791 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988471 4791 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988479 4791 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988488 4791 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988496 4791 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988504 4791 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988512 4791 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988519 4791 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988527 4791 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988535 4791 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988543 4791 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988550 4791 feature_gate.go:330] unrecognized 
feature gate: NodeDisruptionPolicy Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988558 4791 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988566 4791 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988574 4791 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988582 4791 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988590 4791 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988601 4791 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.988617 4791 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988880 4791 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988893 4791 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988902 4791 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988912 4791 feature_gate.go:330] unrecognized feature gate: NewOLM 
Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988920 4791 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988928 4791 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988937 4791 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988946 4791 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988955 4791 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988964 4791 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988974 4791 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988982 4791 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988990 4791 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.988998 4791 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989006 4791 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989015 4791 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989022 4791 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989031 4791 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 00:05:42 crc 
kubenswrapper[4791]: W0217 00:05:42.989039 4791 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989048 4791 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989055 4791 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989063 4791 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989072 4791 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989080 4791 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989088 4791 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989096 4791 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989135 4791 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989148 4791 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989158 4791 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989167 4791 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989175 4791 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989183 4791 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989191 4791 
feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989198 4791 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989207 4791 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989216 4791 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989224 4791 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989232 4791 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989241 4791 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989249 4791 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989257 4791 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989265 4791 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989275 4791 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989285 4791 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989293 4791 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989302 4791 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989310 4791 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989318 4791 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989327 4791 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989336 4791 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989343 4791 feature_gate.go:330] unrecognized feature gate: Example Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989351 4791 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989359 4791 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989367 4791 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989374 4791 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989382 4791 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989390 4791 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989398 4791 feature_gate.go:330] 
unrecognized feature gate: MixedCPUsAllocation Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989406 4791 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989414 4791 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989421 4791 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989432 4791 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989441 4791 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989451 4791 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989462 4791 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989470 4791 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989478 4791 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989487 4791 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989495 4791 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989503 4791 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 00:05:42 crc kubenswrapper[4791]: W0217 00:05:42.989512 4791 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 00:05:42 crc 
kubenswrapper[4791]: I0217 00:05:42.989524 4791 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.989793 4791 server.go:940] "Client rotation is on, will bootstrap in background" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.995874 4791 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.996009 4791 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.998021 4791 server.go:997] "Starting client certificate rotation" Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.998070 4791 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.998397 4791 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-07 16:11:19.92851701 +0000 UTC Feb 17 00:05:42 crc kubenswrapper[4791]: I0217 00:05:42.998514 4791 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.023552 4791 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 00:05:43 crc kubenswrapper[4791]: E0217 00:05:43.026430 4791 
certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.029173 4791 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.043406 4791 log.go:25] "Validated CRI v1 runtime API" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.082687 4791 log.go:25] "Validated CRI v1 image API" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.085034 4791 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.089539 4791 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-17-00-00-44-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.089577 4791 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.114935 4791 manager.go:217] Machine: {Timestamp:2026-02-17 00:05:43.112405425 +0000 UTC m=+0.591918042 CPUVendorID:AuthenticAMD 
NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:0568b345-f1c6-4fd9-8232-7bcd76fcbb73 BootID:e24423ac-3d71-4c2f-893f-f52232a36e88 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:2a:94:b3 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:2a:94:b3 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:78:64:bc Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:e6:3c:39 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:64:ff:a2 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:bb:3b:cb Speed:-1 Mtu:1496} {Name:eth10 MacAddress:2a:18:df:aa:94:c3 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:02:13:b6:4a:01:35 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 
Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 
Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.115471 4791 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.115641 4791 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.117835 4791 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.118247 4791 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.118317 4791 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.118639 4791 topology_manager.go:138] "Creating topology manager with none policy"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.118659 4791 container_manager_linux.go:303] "Creating device plugin manager"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.119441 4791 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.119475 4791 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.119728 4791 state_mem.go:36] "Initialized new in-memory state store"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.119902 4791 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.123940 4791 kubelet.go:418] "Attempting to sync node with API server"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.123974 4791 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.123998 4791 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.124019 4791 kubelet.go:324] "Adding apiserver pod source"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.124035 4791 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.128339 4791 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.130146 4791 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 17 00:05:43 crc kubenswrapper[4791]: W0217 00:05:43.132496 4791 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused
Feb 17 00:05:43 crc kubenswrapper[4791]: E0217 00:05:43.132717 4791 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError"
Feb 17 00:05:43 crc kubenswrapper[4791]: W0217 00:05:43.132488 4791 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused
Feb 17 00:05:43 crc kubenswrapper[4791]: E0217 00:05:43.132793 4791 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.133056 4791 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.135572 4791 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.135615 4791 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.135631 4791 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.135646 4791 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.135668 4791 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.135681 4791 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.135694 4791 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.135715 4791 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.135732 4791 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.135745 4791 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.135772 4791 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.135786 4791 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.138250 4791 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.138827 4791 server.go:1280] "Started kubelet"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.138961 4791 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.139770 4791 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.145076 4791 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.145816 4791 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 17 00:05:43 crc systemd[1]: Started Kubernetes Kubelet.
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.146973 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.147036 4791 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.147138 4791 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.147104 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 10:15:25.27197768 +0000 UTC
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.147176 4791 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.147200 4791 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 17 00:05:43 crc kubenswrapper[4791]: E0217 00:05:43.147264 4791 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 17 00:05:43 crc kubenswrapper[4791]: E0217 00:05:43.154806 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="200ms"
Feb 17 00:05:43 crc kubenswrapper[4791]: W0217 00:05:43.155405 4791 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused
Feb 17 00:05:43 crc kubenswrapper[4791]: E0217 00:05:43.155725 4791 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.155835 4791 server.go:460] "Adding debug handlers to kubelet server"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.156718 4791 factory.go:153] Registering CRI-O factory
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.156768 4791 factory.go:221] Registration of the crio container factory successfully
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.156891 4791 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.156911 4791 factory.go:55] Registering systemd factory
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.156928 4791 factory.go:221] Registration of the systemd container factory successfully
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.156972 4791 factory.go:103] Registering Raw factory
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.157001 4791 manager.go:1196] Started watching for new ooms in manager
Feb 17 00:05:43 crc kubenswrapper[4791]: E0217 00:05:43.155074 4791 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1894dfe8802564cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 00:05:43.138788555 +0000 UTC m=+0.618301122,LastTimestamp:2026-02-17 00:05:43.138788555 +0000 UTC m=+0.618301122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.157896 4791 manager.go:319] Starting recovery of all containers
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163456 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163526 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163560 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163584 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163603 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163622 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163639 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163657 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163678 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163695 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163713 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163731 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163751 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163770 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163787 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163807 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163824 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163844 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163862 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163919 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163937 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163955 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163973 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.163993 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164012 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164029 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164052 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164071 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164089 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164135 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164155 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164182 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164209 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164227 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164246 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164265 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164284 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164301 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164320 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164339 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164358 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164376 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164394 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164412 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164429 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164447 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164467 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164487 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164505 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164526 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164549 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164574 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164606 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164679 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164715 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164749 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164781 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164812 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164840 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164873 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164902 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164929 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164960 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.164991 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165020 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165051 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165077 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165134 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165162 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165180 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165206 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165224 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165241 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165258 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165276 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165293 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165309 4791 reconstruct.go:130] "Volume is marked as uncertain and added
into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165328 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165345 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.165366 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.170703 4791 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.170808 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.170879 4791 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.170911 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.170937 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.170971 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.170994 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171026 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171049 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171070 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171098 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171147 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171175 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171203 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171226 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171257 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171281 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171304 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171332 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171354 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171383 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171404 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171425 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171454 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171476 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171535 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171586 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171625 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171664 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171698 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171725 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171756 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171781 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171812 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171843 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171866 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171895 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171914 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171933 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" 
seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171960 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.171979 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172005 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172025 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172047 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172073 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 17 00:05:43 crc 
kubenswrapper[4791]: I0217 00:05:43.172097 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172153 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172178 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172197 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172224 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172317 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172351 4791 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172381 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172408 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172446 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172470 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172498 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172518 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172539 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172568 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172591 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172651 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172680 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172702 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172731 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172753 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172775 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172801 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172824 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172852 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" 
seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172873 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172898 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172958 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.172989 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173020 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173040 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173061 
4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173088 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173142 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173177 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173197 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173218 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173243 4791 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173262 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173281 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173306 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173324 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173349 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173368 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173389 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173417 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173439 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173465 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173485 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173503 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173528 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173550 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173578 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173601 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173621 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173647 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173672 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173697 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173717 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173737 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173820 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173846 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 17 
00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173873 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173892 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173913 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173947 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173967 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.173988 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174018 4791 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174038 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174063 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174085 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174136 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174165 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174186 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174212 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174231 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174251 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174282 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174301 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174325 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174346 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174367 4791 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174388 4791 reconstruct.go:97] "Volume reconstruction finished" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.174408 4791 reconciler.go:26] "Reconciler: start to sync state" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.187307 4791 manager.go:324] Recovery completed Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.207923 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.211222 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.211296 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.211325 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.213201 4791 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 17 00:05:43 crc kubenswrapper[4791]: 
I0217 00:05:43.213238 4791 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.213295 4791 state_mem.go:36] "Initialized new in-memory state store" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.215717 4791 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.218708 4791 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.218839 4791 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.218940 4791 kubelet.go:2335] "Starting kubelet main sync loop" Feb 17 00:05:43 crc kubenswrapper[4791]: E0217 00:05:43.219173 4791 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 17 00:05:43 crc kubenswrapper[4791]: W0217 00:05:43.222973 4791 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 17 00:05:43 crc kubenswrapper[4791]: E0217 00:05:43.223077 4791 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.230220 4791 policy_none.go:49] "None policy: Start" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.234045 4791 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 
00:05:43.234076 4791 state_mem.go:35] "Initializing new in-memory state store" Feb 17 00:05:43 crc kubenswrapper[4791]: E0217 00:05:43.248167 4791 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.301310 4791 manager.go:334] "Starting Device Plugin manager" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.301383 4791 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.301403 4791 server.go:79] "Starting device plugin registration server" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.302043 4791 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.302078 4791 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.302461 4791 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.302674 4791 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.302707 4791 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 17 00:05:43 crc kubenswrapper[4791]: E0217 00:05:43.318072 4791 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.320386 4791 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 
17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.320500 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.321829 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.321911 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.321930 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.322260 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.323327 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.323387 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.323407 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.323431 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.323443 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.323541 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.323737 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.323793 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.324486 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.324513 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.324525 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.324574 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.324603 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.324617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.324740 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.324769 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.324783 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.324832 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:43 crc kubenswrapper[4791]: 
I0217 00:05:43.324934 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.324964 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.325912 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.325939 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.325955 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.325977 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.325996 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.326009 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.326096 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.326191 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.326220 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.327020 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.327057 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.327071 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.327175 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.327210 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.327229 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.327258 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.327287 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.327964 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.328005 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.328020 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:43 crc kubenswrapper[4791]: E0217 00:05:43.355616 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="400ms" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.376981 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.377066 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.377176 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.377233 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.377281 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.377329 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.377379 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.377453 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.377589 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.377658 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.377711 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.379535 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.379677 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.379803 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.379864 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: W0217 00:05:43.399078 4791 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/cpuset.cpus.effective": read /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/cpuset.cpus.effective: no such device Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.403003 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.404462 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.404526 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.404545 4791 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.404588 4791 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 00:05:43 crc kubenswrapper[4791]: E0217 00:05:43.405354 4791 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.481758 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.481835 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.481867 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.481903 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.481934 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.481962 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.481991 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482021 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482054 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482084 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482146 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482179 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482210 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482241 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482270 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482551 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482718 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482759 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482772 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482813 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482854 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482865 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482905 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482916 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482955 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.482965 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:05:43 crc 
kubenswrapper[4791]: I0217 00:05:43.482959 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.483003 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.483014 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.483063 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.605864 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.607401 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.607449 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:43 crc kubenswrapper[4791]: 
I0217 00:05:43.607467 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.607499 4791 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 00:05:43 crc kubenswrapper[4791]: E0217 00:05:43.608024 4791 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.659752 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.673310 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.693365 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.712858 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: W0217 00:05:43.716186 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-52eae3d930bcf57fe699d62dc736773de6b364797bead37fa0af6571f1c65cce WatchSource:0}: Error finding container 52eae3d930bcf57fe699d62dc736773de6b364797bead37fa0af6571f1c65cce: Status 404 returned error can't find the container with id 52eae3d930bcf57fe699d62dc736773de6b364797bead37fa0af6571f1c65cce Feb 17 00:05:43 crc kubenswrapper[4791]: W0217 00:05:43.719428 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-7206fa68d2aef75ec039c11c5925b1de665b758a385e2592b660ac9b9ad5e7d5 WatchSource:0}: Error finding container 7206fa68d2aef75ec039c11c5925b1de665b758a385e2592b660ac9b9ad5e7d5: Status 404 returned error can't find the container with id 7206fa68d2aef75ec039c11c5925b1de665b758a385e2592b660ac9b9ad5e7d5 Feb 17 00:05:43 crc kubenswrapper[4791]: I0217 00:05:43.723215 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 00:05:43 crc kubenswrapper[4791]: W0217 00:05:43.727069 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-de7aa5dedf102282352abc6e9c97f2ab952075e0759fca591d32adc81c0f3dc0 WatchSource:0}: Error finding container de7aa5dedf102282352abc6e9c97f2ab952075e0759fca591d32adc81c0f3dc0: Status 404 returned error can't find the container with id de7aa5dedf102282352abc6e9c97f2ab952075e0759fca591d32adc81c0f3dc0 Feb 17 00:05:43 crc kubenswrapper[4791]: W0217 00:05:43.739881 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-e6827d5be6c8bdc8f75736e9e5e13a158a2c1f6344e347176e6fc91df05dc0bf WatchSource:0}: Error finding container e6827d5be6c8bdc8f75736e9e5e13a158a2c1f6344e347176e6fc91df05dc0bf: Status 404 returned error can't find the container with id e6827d5be6c8bdc8f75736e9e5e13a158a2c1f6344e347176e6fc91df05dc0bf Feb 17 00:05:43 crc kubenswrapper[4791]: W0217 00:05:43.743346 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-a58cd7b272a667a315c7ef6bd38ab31a962545a2ca9a19238f7b21148520ccfa WatchSource:0}: Error finding container a58cd7b272a667a315c7ef6bd38ab31a962545a2ca9a19238f7b21148520ccfa: Status 404 returned error can't find the container with id a58cd7b272a667a315c7ef6bd38ab31a962545a2ca9a19238f7b21148520ccfa Feb 17 00:05:43 crc kubenswrapper[4791]: E0217 00:05:43.756646 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection 
refused" interval="800ms" Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.008196 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.009867 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.009921 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.009940 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.009975 4791 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 00:05:44 crc kubenswrapper[4791]: E0217 00:05:44.010624 4791 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.141307 4791 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.147558 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 23:04:29.591813727 +0000 UTC Feb 17 00:05:44 crc kubenswrapper[4791]: W0217 00:05:44.218713 4791 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 
17 00:05:44 crc kubenswrapper[4791]: E0217 00:05:44.218790 4791 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.229196 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e6827d5be6c8bdc8f75736e9e5e13a158a2c1f6344e347176e6fc91df05dc0bf"} Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.232690 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"de7aa5dedf102282352abc6e9c97f2ab952075e0759fca591d32adc81c0f3dc0"} Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.234192 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"52eae3d930bcf57fe699d62dc736773de6b364797bead37fa0af6571f1c65cce"} Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.235933 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7206fa68d2aef75ec039c11c5925b1de665b758a385e2592b660ac9b9ad5e7d5"} Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.238161 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a58cd7b272a667a315c7ef6bd38ab31a962545a2ca9a19238f7b21148520ccfa"} Feb 17 00:05:44 crc kubenswrapper[4791]: W0217 00:05:44.462751 4791 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 17 00:05:44 crc kubenswrapper[4791]: E0217 00:05:44.463357 4791 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 17 00:05:44 crc kubenswrapper[4791]: W0217 00:05:44.515367 4791 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 17 00:05:44 crc kubenswrapper[4791]: E0217 00:05:44.515457 4791 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 17 00:05:44 crc kubenswrapper[4791]: W0217 00:05:44.529933 4791 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 17 00:05:44 crc kubenswrapper[4791]: E0217 00:05:44.530021 4791 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 17 00:05:44 crc kubenswrapper[4791]: E0217 00:05:44.557317 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="1.6s" Feb 17 00:05:44 crc kubenswrapper[4791]: E0217 00:05:44.680153 4791 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1894dfe8802564cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 00:05:43.138788555 +0000 UTC m=+0.618301122,LastTimestamp:2026-02-17 00:05:43.138788555 +0000 UTC m=+0.618301122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.811218 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.812970 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.813068 4791 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.813092 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:44 crc kubenswrapper[4791]: I0217 00:05:44.813197 4791 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 00:05:44 crc kubenswrapper[4791]: E0217 00:05:44.813884 4791 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.035982 4791 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 00:05:45 crc kubenswrapper[4791]: E0217 00:05:45.037619 4791 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.141269 4791 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.148404 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 17:16:17.976835861 +0000 UTC Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.247156 4791 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" 
containerID="9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc" exitCode=0 Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.247298 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.247292 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc"} Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.249069 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.249153 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.249180 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.251066 4791 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c" exitCode=0 Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.251181 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c"} Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.251214 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.253789 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.253839 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.253859 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.255935 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba"} Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.256009 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f"} Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.256045 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86"} Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.259329 4791 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf" exitCode=0 Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.259414 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf"} Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 
00:05:45.259469 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.261331 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.261393 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.261414 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.262976 4791 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8" exitCode=0
Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.263027 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8"}
Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.263169 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.264441 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.264490 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.264510 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.266658 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.267962 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.268011 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:45 crc kubenswrapper[4791]: I0217 00:05:45.268044 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.140631 4791 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.148524 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 01:27:21.287275889 +0000 UTC
Feb 17 00:05:46 crc kubenswrapper[4791]: E0217 00:05:46.158199 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="3.2s"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.272174 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.272153 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4"}
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.273410 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.273436 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.273444 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.277371 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373"}
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.277426 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0"}
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.277450 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c"}
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.277563 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118"}
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.281361 4791 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9" exitCode=0
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.281431 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9"}
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.281504 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.285656 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.285687 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.285696 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.289725 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ded890a1b2fba38ade24ab70206558ce4a87f423c6480bf56ef3e6836dc45e66"}
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.289827 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.291377 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.291429 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.291445 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.295240 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846"}
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.295286 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646"}
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.295301 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d"}
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.295812 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.298185 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.298210 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.298218 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.414916 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.416180 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.416220 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.416234 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.416259 4791 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 00:05:46 crc kubenswrapper[4791]: E0217 00:05:46.416628 4791 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc"
Feb 17 00:05:46 crc kubenswrapper[4791]: W0217 00:05:46.592699 4791 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused
Feb 17 00:05:46 crc kubenswrapper[4791]: E0217 00:05:46.593132 4791 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError"
Feb 17 00:05:46 crc kubenswrapper[4791]: I0217 00:05:46.748337 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.149074 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 05:46:36.027539069 +0000 UTC
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.302104 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c"}
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.302614 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.308368 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.308449 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.308486 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.310986 4791 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3" exitCode=0
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.311232 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.311284 4791 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.311366 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.311240 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.311259 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.311721 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3"}
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.313308 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.313376 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.313314 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.313381 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.313428 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.313441 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.313452 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.313479 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.313515 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.313536 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.313405 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.313602 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:47 crc kubenswrapper[4791]: I0217 00:05:47.402395 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:05:48 crc kubenswrapper[4791]: I0217 00:05:48.122146 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:05:48 crc kubenswrapper[4791]: I0217 00:05:48.150211 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 21:27:45.008690217 +0000 UTC
Feb 17 00:05:48 crc kubenswrapper[4791]: I0217 00:05:48.318320 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40"}
Feb 17 00:05:48 crc kubenswrapper[4791]: I0217 00:05:48.318381 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c"}
Feb 17 00:05:48 crc kubenswrapper[4791]: I0217 00:05:48.318403 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2"}
Feb 17 00:05:48 crc kubenswrapper[4791]: I0217 00:05:48.318459 4791 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 00:05:48 crc kubenswrapper[4791]: I0217 00:05:48.318552 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:48 crc kubenswrapper[4791]: I0217 00:05:48.318591 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:48 crc kubenswrapper[4791]: I0217 00:05:48.319799 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:48 crc kubenswrapper[4791]: I0217 00:05:48.319833 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:48 crc kubenswrapper[4791]: I0217 00:05:48.319843 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:48 crc kubenswrapper[4791]: I0217 00:05:48.320371 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:48 crc kubenswrapper[4791]: I0217 00:05:48.320425 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:48 crc kubenswrapper[4791]: I0217 00:05:48.320446 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:48 crc kubenswrapper[4791]: I0217 00:05:48.456697 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.150913 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 10:44:32.594443016 +0000 UTC
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.327057 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d"}
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.327163 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.327194 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae"}
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.327138 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.327259 4791 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.327586 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.328875 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.328980 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.329068 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.329157 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.329200 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.329222 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.329244 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.329284 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.329303 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.358892 4791 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.617030 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.618640 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.618701 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.618717 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.618748 4791 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.748453 4791 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 17 00:05:49 crc kubenswrapper[4791]: I0217 00:05:49.748533 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 17 00:05:50 crc kubenswrapper[4791]: I0217 00:05:50.151028 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 10:31:25.491268608 +0000 UTC
Feb 17 00:05:50 crc kubenswrapper[4791]: I0217 00:05:50.330617 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:50 crc kubenswrapper[4791]: I0217 00:05:50.331874 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:50 crc kubenswrapper[4791]: I0217 00:05:50.331908 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:50 crc kubenswrapper[4791]: I0217 00:05:50.331919 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:50 crc kubenswrapper[4791]: I0217 00:05:50.935734 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 00:05:50 crc kubenswrapper[4791]: I0217 00:05:50.936028 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:50 crc kubenswrapper[4791]: I0217 00:05:50.937567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:50 crc kubenswrapper[4791]: I0217 00:05:50.937635 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:50 crc kubenswrapper[4791]: I0217 00:05:50.937652 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:51 crc kubenswrapper[4791]: I0217 00:05:51.151634 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 05:18:50.689858374 +0000 UTC
Feb 17 00:05:52 crc kubenswrapper[4791]: I0217 00:05:52.036664 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 00:05:52 crc kubenswrapper[4791]: I0217 00:05:52.036962 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:52 crc kubenswrapper[4791]: I0217 00:05:52.038742 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:52 crc kubenswrapper[4791]: I0217 00:05:52.038800 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:52 crc kubenswrapper[4791]: I0217 00:05:52.038825 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:52 crc kubenswrapper[4791]: I0217 00:05:52.152205 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 16:45:34.149881563 +0000 UTC
Feb 17 00:05:53 crc kubenswrapper[4791]: I0217 00:05:53.035925 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 17 00:05:53 crc kubenswrapper[4791]: I0217 00:05:53.036197 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:53 crc kubenswrapper[4791]: I0217 00:05:53.037658 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:53 crc kubenswrapper[4791]: I0217 00:05:53.037705 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:53 crc kubenswrapper[4791]: I0217 00:05:53.037725 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:53 crc kubenswrapper[4791]: I0217 00:05:53.087083 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 00:05:53 crc kubenswrapper[4791]: I0217 00:05:53.087364 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:53 crc kubenswrapper[4791]: I0217 00:05:53.088933 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:53 crc kubenswrapper[4791]: I0217 00:05:53.088989 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:53 crc kubenswrapper[4791]: I0217 00:05:53.089010 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:53 crc kubenswrapper[4791]: I0217 00:05:53.152933 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 06:46:40.362285219 +0000 UTC
Feb 17 00:05:53 crc kubenswrapper[4791]: E0217 00:05:53.319094 4791 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 17 00:05:54 crc kubenswrapper[4791]: I0217 00:05:54.111182 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Feb 17 00:05:54 crc kubenswrapper[4791]: I0217 00:05:54.111494 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:54 crc kubenswrapper[4791]: I0217 00:05:54.113211 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:54 crc kubenswrapper[4791]: I0217 00:05:54.113293 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:54 crc kubenswrapper[4791]: I0217 00:05:54.113325 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:54 crc kubenswrapper[4791]: I0217 00:05:54.153885 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 10:06:14.908424303 +0000 UTC
Feb 17 00:05:55 crc kubenswrapper[4791]: I0217 00:05:55.154528 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 01:56:35.468599848 +0000 UTC
Feb 17 00:05:55 crc kubenswrapper[4791]: I0217 00:05:55.400376 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 00:05:55 crc kubenswrapper[4791]: I0217 00:05:55.400682 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:55 crc kubenswrapper[4791]: I0217 00:05:55.402866 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:55 crc kubenswrapper[4791]: I0217 00:05:55.402925 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:55 crc kubenswrapper[4791]: I0217 00:05:55.402943 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:55 crc kubenswrapper[4791]: I0217 00:05:55.411206 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 00:05:56 crc kubenswrapper[4791]: I0217 00:05:56.044212 4791 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Feb 17 00:05:56 crc kubenswrapper[4791]: I0217 00:05:56.044306 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Feb 17 00:05:56 crc kubenswrapper[4791]: I0217 00:05:56.155528 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 14:30:57.36948466 +0000 UTC
Feb 17 00:05:56 crc kubenswrapper[4791]: I0217 00:05:56.350378 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:56 crc kubenswrapper[4791]: I0217 00:05:56.351377 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:56 crc kubenswrapper[4791]: I0217 00:05:56.351451 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:56 crc kubenswrapper[4791]: I0217 00:05:56.351471 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:56 crc kubenswrapper[4791]: I0217 00:05:56.358943 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 00:05:57 crc kubenswrapper[4791]: W0217 00:05:57.046051 4791 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 17 00:05:57 crc kubenswrapper[4791]: I0217 00:05:57.046431 4791 trace.go:236] Trace[110213120]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 00:05:47.044) (total time: 10001ms):
Feb 17 00:05:57 crc kubenswrapper[4791]: Trace[110213120]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:05:57.046)
Feb 17 00:05:57 crc kubenswrapper[4791]: Trace[110213120]: [10.001867122s] [10.001867122s] END
Feb 17 00:05:57 crc kubenswrapper[4791]: E0217 00:05:57.046560 4791 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 17 00:05:57 crc kubenswrapper[4791]: I0217 00:05:57.142697 4791 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Feb 17 00:05:57 crc kubenswrapper[4791]: I0217 00:05:57.156091 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 04:12:54.320934492 +0000 UTC
Feb 17 00:05:57 crc kubenswrapper[4791]: W0217 00:05:57.163229 4791 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 17 00:05:57 crc kubenswrapper[4791]: I0217 00:05:57.163358 4791 trace.go:236] Trace[1694122038]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 00:05:47.162) (total time: 10001ms):
Feb 17 00:05:57 crc kubenswrapper[4791]: Trace[1694122038]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (00:05:57.163)
Feb 17 00:05:57 crc kubenswrapper[4791]: Trace[1694122038]: [10.001038213s] [10.001038213s] END
Feb 17 00:05:57 crc kubenswrapper[4791]: E0217 00:05:57.163412 4791 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 17 00:05:57 crc kubenswrapper[4791]: I0217 00:05:57.266524 4791 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 17 00:05:57 crc kubenswrapper[4791]: I0217 00:05:57.266609 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 17 00:05:57 crc kubenswrapper[4791]: I0217 00:05:57.282024 4791 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 17 00:05:57 crc kubenswrapper[4791]: I0217 00:05:57.282161 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 17 00:05:57 crc kubenswrapper[4791]: I0217 00:05:57.353099 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 00:05:57 crc kubenswrapper[4791]: I0217 00:05:57.354084 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:05:57 crc kubenswrapper[4791]: I0217 00:05:57.354139 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:05:57 crc kubenswrapper[4791]: I0217 00:05:57.354149 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:05:57 crc kubenswrapper[4791]: I0217 00:05:57.411999 4791 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]log ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]etcd ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/openshift.io-api-request-count-filter ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/openshift.io-startkubeinformers ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/priority-and-fairness-config-consumer ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/priority-and-fairness-filter ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/start-apiextensions-informers ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld
Feb 17 00:05:57 crc kubenswrapper[4791]: [-]poststarthook/crd-informer-synced failed: reason withheld
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/start-system-namespaces-controller ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/start-cluster-authentication-info-controller ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/start-legacy-token-tracking-controller ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/start-service-ip-repair-controllers ok
Feb 17 00:05:57 crc kubenswrapper[4791]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Feb 17 00:05:57 crc kubenswrapper[4791]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Feb 17 00:05:57 crc kubenswrapper[4791]:
[+]poststarthook/priority-and-fairness-config-producer ok Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/bootstrap-controller ok Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/start-kube-aggregator-informers ok Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/apiservice-registration-controller ok Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/apiservice-discovery-controller ok Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 17 00:05:57 crc kubenswrapper[4791]: [+]autoregister-completion ok Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/apiservice-openapi-controller ok Feb 17 00:05:57 crc kubenswrapper[4791]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 17 00:05:57 crc kubenswrapper[4791]: livez check failed Feb 17 00:05:57 crc kubenswrapper[4791]: I0217 00:05:57.412066 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:05:58 crc kubenswrapper[4791]: I0217 00:05:58.157478 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 20:23:54.258048037 +0000 UTC Feb 17 00:05:59 crc kubenswrapper[4791]: I0217 00:05:59.157965 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 
2025-12-15 23:25:36.138704725 +0000 UTC Feb 17 00:05:59 crc kubenswrapper[4791]: I0217 00:05:59.748561 4791 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 00:05:59 crc kubenswrapper[4791]: I0217 00:05:59.748643 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 00:06:00 crc kubenswrapper[4791]: I0217 00:06:00.158253 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 13:23:07.201318549 +0000 UTC Feb 17 00:06:01 crc kubenswrapper[4791]: I0217 00:06:01.158704 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 04:04:01.96312516 +0000 UTC Feb 17 00:06:01 crc kubenswrapper[4791]: I0217 00:06:01.505017 4791 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.036846 4791 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.159307 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 13:48:03.291807453 +0000 UTC Feb 17 00:06:02 crc 
kubenswrapper[4791]: E0217 00:06:02.284619 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.287217 4791 trace.go:236] Trace[675468661]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 00:05:50.609) (total time: 11677ms): Feb 17 00:06:02 crc kubenswrapper[4791]: Trace[675468661]: ---"Objects listed" error: 11677ms (00:06:02.287) Feb 17 00:06:02 crc kubenswrapper[4791]: Trace[675468661]: [11.677164494s] [11.677164494s] END Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.287266 4791 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.290092 4791 trace.go:236] Trace[1113905256]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 00:05:47.279) (total time: 15010ms): Feb 17 00:06:02 crc kubenswrapper[4791]: Trace[1113905256]: ---"Objects listed" error: 15010ms (00:06:02.289) Feb 17 00:06:02 crc kubenswrapper[4791]: Trace[1113905256]: [15.010320493s] [15.010320493s] END Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.290176 4791 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 00:06:02 crc kubenswrapper[4791]: E0217 00:06:02.290940 4791 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.291207 4791 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.298630 4791 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from 
k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.323584 4791 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57938->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.323682 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57938->192.168.126.11:17697: read: connection reset by peer" Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.368788 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.371696 4791 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c" exitCode=255 Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.371750 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c"} Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.408202 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.427220 4791 scope.go:117] "RemoveContainer" 
containerID="f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c" Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.490773 4791 csr.go:261] certificate signing request csr-lxl7v is approved, waiting to be issued Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.529938 4791 csr.go:257] certificate signing request csr-lxl7v is issued Feb 17 00:06:02 crc kubenswrapper[4791]: I0217 00:06:02.998039 4791 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 17 00:06:02 crc kubenswrapper[4791]: W0217 00:06:02.998205 4791 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 17 00:06:02 crc kubenswrapper[4791]: W0217 00:06:02.998233 4791 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 17 00:06:02 crc kubenswrapper[4791]: W0217 00:06:02.998243 4791 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.076618 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.090322 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.127967 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:06:03 crc 
kubenswrapper[4791]: I0217 00:06:03.136902 4791 apiserver.go:52] "Watching apiserver" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.141798 4791 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.142152 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-machine-config-operator/machine-config-daemon-9klkw","openshift-multus/multus-299s7","openshift-multus/multus-additional-cni-plugins-8stwf","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-dns/node-resolver-dl4gt","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-target-xd92c"] Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.142532 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.144522 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dl4gt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.144579 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.144609 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.144907 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.145083 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.145229 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.145239 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.145254 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.145282 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.145370 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.145709 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.145798 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.148299 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.148843 4791 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.150291 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.152500 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.154141 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.160017 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 18:23:01.931267565 +0000 UTC Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.161966 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.162026 4791 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.162044 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.166918 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.167147 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.167025 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.167651 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.167662 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.167859 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.169993 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.170024 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.170069 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 00:06:03 crc 
kubenswrapper[4791]: I0217 00:06:03.170988 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.174169 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.174203 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.176064 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.176266 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.177687 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.177920 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.177917 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.178477 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196682 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 
00:06:03.196725 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196748 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196792 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196818 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196837 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196858 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196877 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196897 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196918 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196940 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196964 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.196987 
4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197011 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197020 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197033 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197092 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197131 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197149 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197170 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197173 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197188 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197249 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197277 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197301 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197323 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197348 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197371 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197397 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197422 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197448 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197473 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197497 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197547 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197573 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197594 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197617 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197639 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197666 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197723 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197747 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197769 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197791 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197813 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197836 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197857 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197880 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197902 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197947 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197971 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197994 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198018 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198039 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 
00:06:03.198060 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198081 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198107 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198148 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198170 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198190 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198240 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198263 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198287 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198310 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198331 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198352 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197421 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198377 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197580 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198385 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198401 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197631 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197676 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197812 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197870 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197902 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197928 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.197994 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198146 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198165 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198234 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198279 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198341 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198349 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198393 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198512 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198541 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198571 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198594 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198644 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198668 4791 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198691 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198714 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198738 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198761 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198783 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198805 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198836 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198858 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198878 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198899 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 00:06:03 crc 
kubenswrapper[4791]: I0217 00:06:03.198922 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198944 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198965 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198988 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199010 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199033 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199059 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199083 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199126 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199151 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199172 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " 
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199197 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199220 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199245 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199269 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199289 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199310 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199331 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199354 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199374 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199423 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199450 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 00:06:03 crc 
kubenswrapper[4791]: I0217 00:06:03.199472 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199494 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199516 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199536 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199579 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199601 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199623 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199645 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199666 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199686 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199707 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 
00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199730 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199751 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199773 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199797 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199818 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199840 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199863 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199887 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199908 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199929 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199950 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 
00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199971 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199990 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200010 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200028 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200050 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200070 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200089 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200255 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200282 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200305 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200326 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc 
kubenswrapper[4791]: I0217 00:06:03.200347 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200368 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200398 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200419 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200440 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200460 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200481 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200505 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200526 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200549 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200570 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 
00:06:03.200592 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200616 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200639 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200664 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200686 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200707 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200728 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200749 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200774 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200815 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200838 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 
00:06:03.200860 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200884 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200906 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200927 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200948 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200970 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod 
\"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200991 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201012 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201036 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201058 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201079 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201103 4791 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201147 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201170 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201192 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201219 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201243 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201265 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201288 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201313 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201334 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201364 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201387 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201410 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201435 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201462 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201486 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201510 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 00:06:03 crc kubenswrapper[4791]: 
I0217 00:06:03.201534 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201559 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201583 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201607 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201631 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201655 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201681 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201705 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201728 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201788 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201819 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-cni-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201842 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-kubelet\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201867 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-etc-kubernetes\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201889 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1b819236-9682-4ef9-8653-516f45335793-hosts-file\") pod \"node-resolver-dl4gt\" (UID: \"1b819236-9682-4ef9-8653-516f45335793\") " pod="openshift-dns/node-resolver-dl4gt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201919 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201944 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02a3a228-86d6-4d54-ad63-0d36c9d59af5-proxy-tls\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: 
I0217 00:06:03.201968 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02a3a228-86d6-4d54-ad63-0d36c9d59af5-mcd-auth-proxy-config\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201991 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-socket-dir-parent\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202014 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rswnq\" (UniqueName: \"kubernetes.io/projected/1104c109-74aa-4fc4-8a1b-914a0d5803a4-kube-api-access-rswnq\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202043 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202069 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202092 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-k8s-cni-cncf-io\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202136 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202166 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202189 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202212 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-netns\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202232 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-conf-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202285 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202310 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1104c109-74aa-4fc4-8a1b-914a0d5803a4-cni-binary-copy\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202333 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-multus-certs\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202355 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4cf8\" (UniqueName: 
\"kubernetes.io/projected/1b819236-9682-4ef9-8653-516f45335793-kube-api-access-l4cf8\") pod \"node-resolver-dl4gt\" (UID: \"1b819236-9682-4ef9-8653-516f45335793\") " pod="openshift-dns/node-resolver-dl4gt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202379 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-os-release\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202411 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202432 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nwqf\" (UniqueName: \"kubernetes.io/projected/eab5901c-ba92-4f20-9960-ac7cfd67b25a-kube-api-access-5nwqf\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202453 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-system-cni-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202474 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-os-release\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202498 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-cni-bin\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202524 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cni-binary-copy\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202550 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-cni-multus\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202577 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202600 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/02a3a228-86d6-4d54-ad63-0d36c9d59af5-rootfs\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202625 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-daemon-config\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202650 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-system-cni-dir\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202675 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202698 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-cnibin\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc 
kubenswrapper[4791]: I0217 00:06:03.202726 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202752 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202776 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cnibin\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202804 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202831 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" 
(UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202857 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202881 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6rgm\" (UniqueName: \"kubernetes.io/projected/02a3a228-86d6-4d54-ad63-0d36c9d59af5-kube-api-access-c6rgm\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202903 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-hostroot\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202957 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202973 4791 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202989 
4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203005 4791 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203023 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203036 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203049 4791 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203062 4791 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203075 4791 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203087 4791 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203100 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203134 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203148 4791 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203160 4791 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203174 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203188 4791 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203202 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node 
\"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203218 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203232 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198439 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198576 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198709 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198707 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198767 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198800 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198820 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198892 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.198910 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199018 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199040 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199046 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199190 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199209 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199367 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199382 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199530 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.199734 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200317 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.200579 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201408 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201465 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201626 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201749 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201878 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.201927 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202201 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202481 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202499 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202716 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202780 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.202922 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203136 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203234 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203380 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203529 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203572 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203591 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203794 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203793 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203812 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203834 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.203949 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.204003 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.204206 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.204271 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.204291 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.204496 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.204634 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.204643 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205003 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205138 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205457 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205455 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205488 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205598 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205719 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205709 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205747 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205851 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.205921 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206040 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206080 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206196 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206250 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206324 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206376 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206437 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206547 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206764 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.206884 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.207093 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.207964 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.208632 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.208704 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:06:03.708687743 +0000 UTC m=+21.188200270 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.209757 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.210875 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.211108 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.211340 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.211353 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.211445 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.211581 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.211781 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.212235 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.212490 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.212556 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.212638 4791 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.211628 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.214537 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.214902 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.215097 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.215134 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.215282 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.216395 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.216873 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.217937 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.218254 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.218334 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.218688 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.219550 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.220030 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.220305 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.220379 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.220494 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.220755 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.220831 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.221019 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.222629 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.223009 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.223294 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.223605 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.223672 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.224339 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.224670 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.225028 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.225190 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.225453 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.225520 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.225890 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.226251 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.227249 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.228424 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.228731 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.228990 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.229833 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.232162 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.232409 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.232574 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.232746 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.233027 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.233409 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.233692 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.234006 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.234448 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.234679 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.234783 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.235090 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.235225 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.235404 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.235466 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.235826 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.235915 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:03.735891594 +0000 UTC m=+21.215404121 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.236267 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c").
InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.236431 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.236849 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.236864 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.237653 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.237960 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.238132 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.238156 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.238286 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.238448 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.238722 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.238740 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.239028 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.240021 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.240280 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.240493 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.241437 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.241757 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.241822 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:03.741801681 +0000 UTC m=+21.221314208 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.242162 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:03.742107111 +0000 UTC m=+21.221619638 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.242298 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.242791 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.243025 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.250637 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.250706 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.250732 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.250749 4791 projected.go:194] Error 
preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.251038 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.251088 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:03.751057894 +0000 UTC m=+21.230570411 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.251222 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.251595 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.252028 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.259850 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.260079 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.260647 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.260780 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.261129 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.262203 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.262430 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.262609 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.262752 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.263024 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.263556 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.263622 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.264071 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.264110 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.264829 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.265099 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.265317 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.266832 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.267794 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.268394 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.269914 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.270892 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.272513 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.273920 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" 
(OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.274189 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.274515 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.275342 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.276512 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.277308 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.278268 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.279541 4791 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.280368 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.281960 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.283351 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.283899 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.284411 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.284474 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.284502 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.284949 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.285262 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.285938 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.286973 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.287087 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.287525 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.287994 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.294099 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.296261 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304392 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cnibin\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304455 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rgm\" (UniqueName: \"kubernetes.io/projected/02a3a228-86d6-4d54-ad63-0d36c9d59af5-kube-api-access-c6rgm\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304477 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-hostroot\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304510 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-cni-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304531 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-kubelet\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304551 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-etc-kubernetes\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304570 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1b819236-9682-4ef9-8653-516f45335793-hosts-file\") pod \"node-resolver-dl4gt\" (UID: \"1b819236-9682-4ef9-8653-516f45335793\") " pod="openshift-dns/node-resolver-dl4gt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304602 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02a3a228-86d6-4d54-ad63-0d36c9d59af5-proxy-tls\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304622 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02a3a228-86d6-4d54-ad63-0d36c9d59af5-mcd-auth-proxy-config\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304643 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-socket-dir-parent\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304664 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rswnq\" (UniqueName: \"kubernetes.io/projected/1104c109-74aa-4fc4-8a1b-914a0d5803a4-kube-api-access-rswnq\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304698 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-k8s-cni-cncf-io\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304716 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304737 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304759 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-netns\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304777 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-conf-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304797 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1104c109-74aa-4fc4-8a1b-914a0d5803a4-cni-binary-copy\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304817 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-multus-certs\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304836 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4cf8\" (UniqueName: \"kubernetes.io/projected/1b819236-9682-4ef9-8653-516f45335793-kube-api-access-l4cf8\") pod \"node-resolver-dl4gt\" (UID: \"1b819236-9682-4ef9-8653-516f45335793\") " pod="openshift-dns/node-resolver-dl4gt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304855 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-os-release\") pod \"multus-additional-cni-plugins-8stwf\" (UID: 
\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304887 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304917 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nwqf\" (UniqueName: \"kubernetes.io/projected/eab5901c-ba92-4f20-9960-ac7cfd67b25a-kube-api-access-5nwqf\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304937 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-system-cni-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304956 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-os-release\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.304979 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-cni-bin\") pod \"multus-299s7\" (UID: 
\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305002 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cni-binary-copy\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305021 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-cni-multus\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305052 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/02a3a228-86d6-4d54-ad63-0d36c9d59af5-rootfs\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305072 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-daemon-config\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305095 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-system-cni-dir\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " 
pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305130 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305152 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-cnibin\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305219 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305233 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305247 4791 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305260 4791 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305271 4791 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305283 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305295 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305309 4791 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305322 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305333 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305346 4791 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305358 4791 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305370 4791 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305382 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305394 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305406 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305418 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305430 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305441 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" 
DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305454 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305466 4791 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305478 4791 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305490 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305501 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305512 4791 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305523 4791 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: 
I0217 00:06:03.305536 4791 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305549 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305560 4791 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305572 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305584 4791 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305595 4791 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305606 4791 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305617 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305629 4791 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305640 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305652 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305696 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305708 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305720 4791 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305731 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on 
node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305744 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305756 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305767 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305778 4791 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305790 4791 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305806 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305818 4791 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node 
\"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305832 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305844 4791 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305855 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305868 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305880 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305891 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305903 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305915 4791 
reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305927 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305938 4791 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305950 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305961 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305975 4791 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305987 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.305998 4791 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306010 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306021 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306031 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306043 4791 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306054 4791 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306065 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306076 4791 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306089 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306100 4791 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306130 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306143 4791 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306154 4791 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306166 4791 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306179 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306191 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306204 4791 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306218 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306230 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306241 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306257 4791 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306268 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath 
\"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306279 4791 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306290 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306302 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306313 4791 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306324 4791 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306335 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306348 4791 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 
00:06:03.306359 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306370 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306382 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306394 4791 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306405 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306415 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306427 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306439 4791 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306450 4791 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306462 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306473 4791 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306485 4791 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306496 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306508 4791 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306519 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306530 4791 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306542 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306553 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306564 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306576 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306588 4791 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306600 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" 
DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306611 4791 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306622 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306633 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306645 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306656 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306667 4791 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306679 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306691 4791 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306702 4791 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306713 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306724 4791 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306734 4791 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306745 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306756 4791 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306767 4791 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306778 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306789 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306800 4791 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306811 4791 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306823 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306834 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306848 4791 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node 
\"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306859 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306870 4791 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306882 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306893 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306905 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306917 4791 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306928 4791 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306940 4791 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306951 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306965 4791 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306976 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306987 4791 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.306998 4791 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307010 4791 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307021 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307032 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307043 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307055 4791 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307066 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307078 4791 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307089 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307100 4791 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" 
DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307126 4791 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307138 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307149 4791 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307160 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307173 4791 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307184 4791 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307195 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307207 4791 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307218 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307232 4791 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307243 4791 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307254 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307265 4791 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307276 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307287 4791 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307299 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307365 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-cnibin\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307508 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307913 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-conf-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.307963 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-netns\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308030 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cnibin\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308303 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-hostroot\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308592 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-cni-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308617 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-kubelet\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308622 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1104c109-74aa-4fc4-8a1b-914a0d5803a4-cni-binary-copy\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" 
Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308639 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-etc-kubernetes\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308668 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-multus-certs\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.308670 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1b819236-9682-4ef9-8653-516f45335793-hosts-file\") pod \"node-resolver-dl4gt\" (UID: \"1b819236-9682-4ef9-8653-516f45335793\") " pod="openshift-dns/node-resolver-dl4gt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.309013 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-os-release\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.309904 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.310734 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02a3a228-86d6-4d54-ad63-0d36c9d59af5-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.310785 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-socket-dir-parent\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.313378 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-cni-multus\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.314134 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-system-cni-dir\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.314733 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-os-release\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.314741 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cni-binary-copy\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " 
pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.314829 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.314893 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.314933 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-run-k8s-cni-cncf-io\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.315038 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-system-cni-dir\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.315048 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.315126 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/02a3a228-86d6-4d54-ad63-0d36c9d59af5-rootfs\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.315176 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1104c109-74aa-4fc4-8a1b-914a0d5803a4-host-var-lib-cni-bin\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.316334 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eab5901c-ba92-4f20-9960-ac7cfd67b25a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.316838 4791 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.317445 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.317622 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.322473 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.323564 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.323727 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1104c109-74aa-4fc4-8a1b-914a0d5803a4-multus-daemon-config\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.323933 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eab5901c-ba92-4f20-9960-ac7cfd67b25a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.323997 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.325135 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" 
path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.325405 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02a3a228-86d6-4d54-ad63-0d36c9d59af5-proxy-tls\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.326384 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.334950 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.336001 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.338638 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rswnq\" (UniqueName: \"kubernetes.io/projected/1104c109-74aa-4fc4-8a1b-914a0d5803a4-kube-api-access-rswnq\") pod \"multus-299s7\" (UID: \"1104c109-74aa-4fc4-8a1b-914a0d5803a4\") " pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.339256 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.340294 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.342846 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nwqf\" (UniqueName: \"kubernetes.io/projected/eab5901c-ba92-4f20-9960-ac7cfd67b25a-kube-api-access-5nwqf\") pod \"multus-additional-cni-plugins-8stwf\" (UID: \"eab5901c-ba92-4f20-9960-ac7cfd67b25a\") " pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.342979 4791 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-l4cf8\" (UniqueName: \"kubernetes.io/projected/1b819236-9682-4ef9-8653-516f45335793-kube-api-access-l4cf8\") pod \"node-resolver-dl4gt\" (UID: \"1b819236-9682-4ef9-8653-516f45335793\") " pod="openshift-dns/node-resolver-dl4gt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.346665 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.347468 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.349013 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.349195 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6rgm\" (UniqueName: \"kubernetes.io/projected/02a3a228-86d6-4d54-ad63-0d36c9d59af5-kube-api-access-c6rgm\") pod \"machine-config-daemon-9klkw\" (UID: \"02a3a228-86d6-4d54-ad63-0d36c9d59af5\") " pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.349789 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.352447 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.354844 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.355650 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.361142 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.361789 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 17 00:06:03 crc 
kubenswrapper[4791]: I0217 00:06:03.363412 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.363993 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.365037 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.365798 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.366654 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.366979 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.367723 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.368321 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.376873 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.378993 4791 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef"} Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.378988 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.386865 4791 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.396367 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6
736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.405018 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hldzt"] Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.407971 4791 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.408001 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.408311 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.410690 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.410800 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.410818 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.410881 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.411699 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.411966 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.412321 4791 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.412530 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.421921 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.432412 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.442559 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.451787 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.457929 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.459407 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.466709 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-dl4gt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.472658 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.480323 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.487051 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.489383 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8stwf" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.496373 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.507739 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509076 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-slash\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509230 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509315 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-kubelet\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509393 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-var-lib-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509488 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-netd\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509635 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-node-log\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509668 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-bin\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509694 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-etc-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509720 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-systemd\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509742 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-ovn-kubernetes\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509792 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509835 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-script-lib\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509874 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-config\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509893 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-env-overrides\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509951 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r26vg\" (UniqueName: \"kubernetes.io/projected/e7fe508f-1e8c-4da7-8f99-108e73cb3791-kube-api-access-r26vg\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.509987 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-netns\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.510014 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovn-node-metrics-cert\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.510029 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-systemd-units\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.510044 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-ovn\") pod \"ovnkube-node-hldzt\" (UID: 
\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.510057 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-log-socket\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.512490 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.522101 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.531436 4791 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-17 00:01:02 +0000 UTC, rotation deadline is 2027-01-01 21:02:45.135846023 +0000 UTC Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.531499 4791 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7652h56m41.604349062s for next certificate rotation Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.537967 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: W0217 00:06:03.549593 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-6fb824bd106246af15923e4ceb716e56fbb9cb42b94c8ec09243aa18bbd00465 WatchSource:0}: Error finding container 6fb824bd106246af15923e4ceb716e56fbb9cb42b94c8ec09243aa18bbd00465: Status 404 returned error can't find the container with id 6fb824bd106246af15923e4ceb716e56fbb9cb42b94c8ec09243aa18bbd00465 Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.564480 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.569890 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-299s7" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.581821 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.604904 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.611385 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-netns\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.611531 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-env-overrides\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.611560 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r26vg\" (UniqueName: \"kubernetes.io/projected/e7fe508f-1e8c-4da7-8f99-108e73cb3791-kube-api-access-r26vg\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.611478 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-netns\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.611641 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovn-node-metrics-cert\") pod 
\"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612029 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-systemd-units\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612057 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-ovn\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612099 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-log-socket\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612213 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-systemd-units\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612240 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-ovn\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 
00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612321 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-slash\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612354 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612325 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-log-socket\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612379 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-netd\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612375 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-slash\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612402 4791 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612415 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-env-overrides\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612452 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-kubelet\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612423 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-kubelet\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612512 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-var-lib-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612428 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-netd\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612563 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-var-lib-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612568 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-node-log\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612596 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-etc-openvswitch\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612624 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-bin\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612666 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-etc-openvswitch\") pod 
\"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612674 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-systemd\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612701 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-systemd\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612720 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-script-lib\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612734 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-bin\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612742 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-ovn-kubernetes\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612783 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612810 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-config\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.613735 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-config\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.612633 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-node-log\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.613825 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-ovn-kubernetes\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.613879 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.616537 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-script-lib\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.619447 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovn-node-metrics-cert\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.637560 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.663877 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.666824 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r26vg\" (UniqueName: \"kubernetes.io/projected/e7fe508f-1e8c-4da7-8f99-108e73cb3791-kube-api-access-r26vg\") pod \"ovnkube-node-hldzt\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.688477 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.700528 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.713765 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.713973 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:06:04.713955016 +0000 UTC m=+22.193467543 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.713957 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.720651 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.726941 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.737154 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.749781 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.764197 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.776412 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.788093 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.801446 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.817007 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.817066 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.817125 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.817160 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817318 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817338 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817351 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817405 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-17 00:06:04.81738325 +0000 UTC m=+22.296895777 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817463 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817494 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:04.817486984 +0000 UTC m=+22.296999511 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817547 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817571 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817596 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817607 4791 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817585 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:04.817574047 +0000 UTC m=+22.297086574 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:03 crc kubenswrapper[4791]: E0217 00:06:03.817866 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:04.817851415 +0000 UTC m=+22.297363942 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.822862 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.844720 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.853965 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:03 crc kubenswrapper[4791]: I0217 00:06:03.863060 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.160659 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 16:25:44.807070515 +0000 UTC Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.382795 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6" exitCode=0 Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.382896 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.382953 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"8359c5871ee1aee2d63af5dec0cce97a0b6622d7bd312c2093b490d8e6067659"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.384870 4791 generic.go:334] "Generic (PLEG): container finished" podID="eab5901c-ba92-4f20-9960-ac7cfd67b25a" 
containerID="40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c" exitCode=0 Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.384950 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerDied","Data":"40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.384996 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerStarted","Data":"15db0927244844c736c031a7899f4eb3cbe334b39369dcf8dcdbcca675203ee9"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.386441 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.386491 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c65323782cfc3a851ec1f29e3d8a508c8f6cb90f787b2f4a3959638d4e1d03e3"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.388491 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerStarted","Data":"de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.388544 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" 
event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerStarted","Data":"0504e7daa6d550e6b0ea30bcfb0365273e8a7d32d024bc1f1a472355f9e18036"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.389555 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.389583 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.389596 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6fb824bd106246af15923e4ceb716e56fbb9cb42b94c8ec09243aa18bbd00465"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.390814 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.390858 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.390878 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"55a7a6fbeb41808509bc1dbb654a2f37dc480cdd37aa343bf72f411031a80257"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.391523 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"540ca4fc76207bc22c76b263b8d30c65345cbebf01b8f18aa8f525088ed777ae"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.393687 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dl4gt" event={"ID":"1b819236-9682-4ef9-8653-516f45335793","Type":"ContainerStarted","Data":"68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.393718 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dl4gt" event={"ID":"1b819236-9682-4ef9-8653-516f45335793","Type":"ContainerStarted","Data":"15c0c51a7ca8fe26a9cfd09443f9ae3f4990df41e295376a8c1a10384ed8d9c6"} Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.394098 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.409647 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.419315 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.430616 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.440877 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.448909 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.461797 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.471550 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.479905 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.495758 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.514220 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.532695 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.545686 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.560450 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.573280 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.585631 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.599809 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.613730 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.628255 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.646194 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.664674 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.674344 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.691732 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.705249 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.718074 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.725220 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.725412 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:06:06.725386644 +0000 UTC m=+24.204899211 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.734440 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.749208 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:04Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.826644 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.826692 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.826717 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:04 crc kubenswrapper[4791]: I0217 00:06:04.826746 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826847 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826873 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826887 4791 projected.go:194] Error preparing data for 
projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826892 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826888 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826927 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826910 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826991 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.826946 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:06.826929418 +0000 UTC m=+24.306441955 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.827029 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:06.827007771 +0000 UTC m=+24.306520398 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.827048 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:06.827038032 +0000 UTC m=+24.306550669 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:04 crc kubenswrapper[4791]: E0217 00:06:04.827073 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:06.827064953 +0000 UTC m=+24.306577620 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.161132 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 13:08:04.306743855 +0000 UTC Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.209148 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-k5kxc"] Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.209457 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.211548 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.211585 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.212276 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.219614 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.219650 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.219662 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.219725 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:05 crc kubenswrapper[4791]: E0217 00:06:05.219798 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:05 crc kubenswrapper[4791]: E0217 00:06:05.219964 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:05 crc kubenswrapper[4791]: E0217 00:06:05.220441 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.225972 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.228373 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.230911 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.232382 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" 
path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.233851 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.240966 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.255536 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.269411 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.283209 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.296607 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.309683 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.330586 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f0a7811-6a89-456b-95ea-6c8e698479dd-serviceca\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.330635 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnb2k\" (UniqueName: \"kubernetes.io/projected/5f0a7811-6a89-456b-95ea-6c8e698479dd-kube-api-access-mnb2k\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.330716 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f0a7811-6a89-456b-95ea-6c8e698479dd-host\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.332791 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.346557 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.375162 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.386222 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.400766 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"} Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.401155 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" 
event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"} Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.401173 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"} Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.401186 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"} Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.401200 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"} Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.402832 4791 generic.go:334] "Generic (PLEG): container finished" podID="eab5901c-ba92-4f20-9960-ac7cfd67b25a" containerID="c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737" exitCode=0 Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.402918 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerDied","Data":"c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737"} Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.409617 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.431826 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f0a7811-6a89-456b-95ea-6c8e698479dd-host\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.431906 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f0a7811-6a89-456b-95ea-6c8e698479dd-serviceca\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.431930 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnb2k\" (UniqueName: \"kubernetes.io/projected/5f0a7811-6a89-456b-95ea-6c8e698479dd-kube-api-access-mnb2k\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.432233 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f0a7811-6a89-456b-95ea-6c8e698479dd-host\") pod \"node-ca-k5kxc\" (UID: 
\"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.433177 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f0a7811-6a89-456b-95ea-6c8e698479dd-serviceca\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.455567 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.477860 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnb2k\" (UniqueName: \"kubernetes.io/projected/5f0a7811-6a89-456b-95ea-6c8e698479dd-kube-api-access-mnb2k\") pod \"node-ca-k5kxc\" (UID: \"5f0a7811-6a89-456b-95ea-6c8e698479dd\") " pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.479052 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.512052 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.526498 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.551179 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.566984 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.580750 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.593704 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.609513 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.621939 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.635205 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.671848 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.706240 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.723283 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-k5kxc" Feb 17 00:06:05 crc kubenswrapper[4791]: W0217 00:06:05.741723 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f0a7811_6a89_456b_95ea_6c8e698479dd.slice/crio-5bb4ec47033ef1a3dddd5b3aa0a7ad1a44c3a96eb8ffdef9b7f884640f5155a5 WatchSource:0}: Error finding container 5bb4ec47033ef1a3dddd5b3aa0a7ad1a44c3a96eb8ffdef9b7f884640f5155a5: Status 404 returned error can't find the container with id 5bb4ec47033ef1a3dddd5b3aa0a7ad1a44c3a96eb8ffdef9b7f884640f5155a5 Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.755267 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.787235 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.830243 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:05 crc kubenswrapper[4791]: I0217 00:06:05.866271 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.162264 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 18:08:46.098403841 +0000 UTC Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.413688 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" 
event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"} Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.416042 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845"} Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.419332 4791 generic.go:334] "Generic (PLEG): container finished" podID="eab5901c-ba92-4f20-9960-ac7cfd67b25a" containerID="19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee" exitCode=0 Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.419420 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerDied","Data":"19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee"} Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.421272 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-k5kxc" event={"ID":"5f0a7811-6a89-456b-95ea-6c8e698479dd","Type":"ContainerStarted","Data":"f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9"} Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.421312 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-k5kxc" event={"ID":"5f0a7811-6a89-456b-95ea-6c8e698479dd","Type":"ContainerStarted","Data":"5bb4ec47033ef1a3dddd5b3aa0a7ad1a44c3a96eb8ffdef9b7f884640f5155a5"} Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.442996 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.462997 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.491269 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.511537 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.526032 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.539316 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.552652 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.563263 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.587485 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.597631 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.609268 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.620479 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.639096 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.648931 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.662249 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.674458 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.688948 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.698483 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.710857 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.724392 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.735191 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.742965 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.743234 4791 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:06:10.743192787 +0000 UTC m=+28.222705344 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.751450 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.755168 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.759841 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.784722 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.808097 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.844676 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.844891 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845140 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:10.845098463 +0000 UTC m=+28.324610990 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845149 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845169 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845182 4791 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845225 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:10.845211167 +0000 UTC m=+28.324723694 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.845035 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.845255 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.845276 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845347 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845356 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845364 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845383 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:10.845377372 +0000 UTC m=+28.324889889 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845427 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: E0217 00:06:06.845456 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:10.845451145 +0000 UTC m=+28.324963672 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.850291 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.901008 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.924895 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:06 crc kubenswrapper[4791]: I0217 00:06:06.973753 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.012414 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.019515 4791 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.072806 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.104345 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.147257 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.162840 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation 
deadline is 2025-12-14 22:40:58.886449551 +0000 UTC Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.189879 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543
a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.219698 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.219749 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:07 crc kubenswrapper[4791]: E0217 00:06:07.220149 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.219796 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:07 crc kubenswrapper[4791]: E0217 00:06:07.220006 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:07 crc kubenswrapper[4791]: E0217 00:06:07.220412 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.230079 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.265611 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.308408 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.346691 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.387305 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.427359 4791 generic.go:334] "Generic (PLEG): container finished" podID="eab5901c-ba92-4f20-9960-ac7cfd67b25a" containerID="74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08" exitCode=0 Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.427659 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerDied","Data":"74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08"} Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.432529 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.468724 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.509332 4791 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.550833 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.589156 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.629332 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.671615 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.706813 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.749860 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.790601 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.826089 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.869024 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.908561 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.949470 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:07 crc kubenswrapper[4791]: I0217 00:06:07.991705 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:07Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.034479 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.070834 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.122329 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.148696 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.163431 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 00:40:40.45421499 +0000 UTC Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.203449 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.228912 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.435235 4791 generic.go:334] "Generic (PLEG): container finished" podID="eab5901c-ba92-4f20-9960-ac7cfd67b25a" containerID="1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90" exitCode=0 Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.435333 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerDied","Data":"1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90"} Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.441878 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"} Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.465998 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.483613 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.514787 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.531534 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.550427 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.576383 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.589622 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.600170 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.611613 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.625462 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.671990 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00
:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.691279 4791 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.693442 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.693490 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.693505 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.693622 4791 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.704381 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.759881 4791 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.760186 4791 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.761212 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.761246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.761258 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.761275 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.761286 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:08Z","lastTransitionTime":"2026-02-17T00:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:08 crc kubenswrapper[4791]: E0217 00:06:08.771863 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.775219 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.775265 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.775276 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.775292 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.775304 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:08Z","lastTransitionTime":"2026-02-17T00:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:08 crc kubenswrapper[4791]: E0217 00:06:08.787616 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.790747 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.790789 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.790805 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.790828 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.790844 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:08Z","lastTransitionTime":"2026-02-17T00:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.791933 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z 
is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: E0217 00:06:08.810874 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.814033 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.814074 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.814086 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.814127 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.814142 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:08Z","lastTransitionTime":"2026-02-17T00:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:08 crc kubenswrapper[4791]: E0217 00:06:08.826702 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.829515 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.830261 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.830291 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc 
kubenswrapper[4791]: I0217 00:06:08.830300 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.830315 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.830326 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:08Z","lastTransitionTime":"2026-02-17T00:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:08 crc kubenswrapper[4791]: E0217 00:06:08.842924 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: E0217 00:06:08.843026 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.844538 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.844580 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.844592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.844610 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.844622 4791 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:08Z","lastTransitionTime":"2026-02-17T00:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.882922 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:08Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.946758 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.946831 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.946851 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.946877 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:08 crc kubenswrapper[4791]: I0217 00:06:08.946898 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:08Z","lastTransitionTime":"2026-02-17T00:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.049981 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.050029 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.050042 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.050059 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.050070 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.153156 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.153223 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.153242 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.153268 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.153288 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.164379 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 16:55:15.747774279 +0000 UTC Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.219463 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.219473 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:09 crc kubenswrapper[4791]: E0217 00:06:09.219610 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.219483 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:09 crc kubenswrapper[4791]: E0217 00:06:09.219667 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:09 crc kubenswrapper[4791]: E0217 00:06:09.219783 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.256324 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.256436 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.256457 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.256484 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.256504 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.359718 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.359758 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.359769 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.359786 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.359798 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.455097 4791 generic.go:334] "Generic (PLEG): container finished" podID="eab5901c-ba92-4f20-9960-ac7cfd67b25a" containerID="a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6" exitCode=0 Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.455352 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerDied","Data":"a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.464583 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.464651 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.464675 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.464705 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.464728 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.482482 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.508934 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.540168 4791 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.560763 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.567867 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.567915 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.567933 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.567955 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.567984 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.582652 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.601032 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.630591 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.645967 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.667616 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.670867 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc 
kubenswrapper[4791]: I0217 00:06:09.670925 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.670941 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.670972 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.670991 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.688197 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.716631 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.735971 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.758577 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.773414 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.773522 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.773545 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.773572 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.773593 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.778496 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.803779 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bf
bd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.876210 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.876256 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.876271 4791 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.876295 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.876313 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.979053 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.979137 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.979158 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.979182 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:09 crc kubenswrapper[4791]: I0217 00:06:09.979199 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:09Z","lastTransitionTime":"2026-02-17T00:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.082186 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.082234 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.082254 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.082279 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.082296 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.165405 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 03:26:00.753948679 +0000 UTC Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.185809 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.185863 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.185886 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.185917 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.185939 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.289956 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.290006 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.290022 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.290044 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.290064 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.392973 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.393035 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.393051 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.393078 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.393098 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.468446 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.469061 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.475450 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" event={"ID":"eab5901c-ba92-4f20-9960-ac7cfd67b25a","Type":"ContainerStarted","Data":"dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.493530 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.496891 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.496929 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.496951 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.496977 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.496994 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.514527 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.516414 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.534442 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.548878 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.563372 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.581235 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.598401 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.599826 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.599868 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.599880 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.599898 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.599910 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.612236 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.622756 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.643247 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70
f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.655960 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374c
d73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.691848 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.703017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.703084 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.703102 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.703166 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.703231 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.706284 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.725158 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.742675 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc 
kubenswrapper[4791]: I0217 00:06:10.763674 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.784324 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.786930 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.787148 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:06:18.787098147 +0000 UTC m=+36.266610714 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.804786 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a9
5b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finish
edAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1
361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-1
7T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.813245 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.813309 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.813321 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.813338 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.813350 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.833901 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.857369 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.877750 4791 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.888808 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.888913 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.888955 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.888990 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 
00:06:10.889066 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889101 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889149 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889181 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889221 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:18.889199399 +0000 UTC m=+36.368711956 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889232 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889263 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:18.88923525 +0000 UTC m=+36.368747817 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889277 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889341 4791 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:10 crc 
kubenswrapper[4791]: E0217 00:06:10.889367 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889433 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:18.889403165 +0000 UTC m=+36.368915732 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:10 crc kubenswrapper[4791]: E0217 00:06:10.889520 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:18.889475588 +0000 UTC m=+36.368988185 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.899032 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.915879 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.915937 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.915955 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.915984 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.916001 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:10Z","lastTransitionTime":"2026-02-17T00:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.919500 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.934814 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.967977 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.980489 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:10 crc kubenswrapper[4791]: I0217 00:06:10.996756 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.015035 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.018853 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.018903 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.018922 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.018947 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.018964 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.042694 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.057995 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.121863 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.121923 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.121940 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.121964 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 
00:06:11.121983 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.166569 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 14:49:48.121724201 +0000 UTC Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.219433 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.219586 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:11 crc kubenswrapper[4791]: E0217 00:06:11.219635 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.219712 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:11 crc kubenswrapper[4791]: E0217 00:06:11.219886 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:11 crc kubenswrapper[4791]: E0217 00:06:11.220029 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.224485 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.224551 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.224570 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.224592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.224608 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.365384 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.365427 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.365438 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.365455 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.365468 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.468099 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.468191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.468207 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.468233 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.468254 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.479287 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.479322 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.516351 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.534476 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.546725 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.564899 4791 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.570556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.570591 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.570602 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.570617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.570628 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.586839 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.603969 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.620137 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.652778 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.669252 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.674382 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.674421 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.674439 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.674463 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc 
kubenswrapper[4791]: I0217 00:06:11.674482 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.699103 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.1
26.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.717279 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.750046 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.766590 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.776993 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.777043 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.777060 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.777083 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 
00:06:11.777101 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.788157 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.797878 4791 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.806327 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.828646 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.880082 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.880169 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.880187 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.880213 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.880230 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.983931 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.983978 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.983996 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.984019 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:11 crc kubenswrapper[4791]: I0217 00:06:11.984036 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:11Z","lastTransitionTime":"2026-02-17T00:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.041452 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.056439 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.074465 4791 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/c
rcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b
43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fal
se,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.104584 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.104626 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.104645 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.104668 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.104684 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.126563 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.154937 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.166759 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.166986 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 11:20:56.492612949 +0000 UTC Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.179527 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.189539 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.206891 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00
:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.206979 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.207161 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.207174 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.207193 4791 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.207204 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.216608 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.229544 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.240852 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.258376 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.272784 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-1
7T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.285340 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.304928 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:12Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.309332 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.309381 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.309393 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.309411 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.309423 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.411916 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.411955 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.411966 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.411983 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.411994 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.514351 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.514409 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.514426 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.514451 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.514467 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.617537 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.617925 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.618087 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.618267 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.618391 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.721158 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.721199 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.721207 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.721220 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.721229 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.823210 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.823240 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.823249 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.823261 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.823270 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.925617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.925655 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.925665 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.925680 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.925693 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:12Z","lastTransitionTime":"2026-02-17T00:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:12 crc kubenswrapper[4791]: I0217 00:06:12.939885 4791 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.030044 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.030153 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.030176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.030204 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.030234 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.133155 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.133202 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.133229 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.133242 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.133251 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.167941 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 02:47:14.519648344 +0000 UTC Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.219312 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.219355 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:13 crc kubenswrapper[4791]: E0217 00:06:13.219453 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.219502 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:13 crc kubenswrapper[4791]: E0217 00:06:13.219691 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:13 crc kubenswrapper[4791]: E0217 00:06:13.219866 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.235453 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.235485 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.235496 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.235513 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.235525 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.245081 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.258266 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.277950 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.292583 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.305552 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.319372 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.338058 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.338162 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.338181 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.338212 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.338234 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.343230 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.359612 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.392746 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.408554 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.430025 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.441375 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc 
kubenswrapper[4791]: I0217 00:06:13.441421 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.441439 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.441455 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.441466 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.444896 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.462669 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.480856 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.486395 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/0.log" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.489896 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa" exitCode=1 Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.489949 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.490533 4791 scope.go:117] "RemoveContainer" containerID="d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.504741 4791 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exit
Code\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863
dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\
\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.522681 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.536580 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.548313 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.548355 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.548368 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.548387 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.548398 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.554878 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.571745 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.587191 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.607281 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.625899 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.647027 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.651946 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.651989 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.652006 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.652032 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.652049 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.666356 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.697300 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:13Z\\\",\\\"message\\\":\\\"olicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:13.113082 6064 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 00:06:13.113171 6064 handler.go:190] Sending *v1.Pod 
event handler 3 for removal\\\\nI0217 00:06:13.113182 6064 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 00:06:13.113204 6064 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:13.113225 6064 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:13.113237 6064 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 00:06:13.113241 6064 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 00:06:13.113253 6064 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 00:06:13.113256 6064 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:13.113261 6064 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:13.113258 6064 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 00:06:13.113268 6064 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:13.113284 6064 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:13.113322 6064 factory.go:656] Stopping watch factory\\\\nI0217 00:06:13.113339 6064 ovnkube.go:599] Stopped ovnkube\\\\nI0217 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.720422 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.752492 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00
:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.754580 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.754615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.754630 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.754654 4791 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.754672 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.772617 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.789179 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.811580 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.857285 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.857349 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.857367 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.857393 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.857412 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.960000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.960040 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.960048 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.960065 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:13 crc kubenswrapper[4791]: I0217 00:06:13.960075 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:13Z","lastTransitionTime":"2026-02-17T00:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.062890 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.062948 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.062960 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.062980 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.062994 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.166586 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.166640 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.166653 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.166669 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.166679 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.168985 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 11:11:46.687364073 +0000 UTC Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.269436 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.269501 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.269520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.269547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.269567 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.372956 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.373044 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.373065 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.373088 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.373128 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.475800 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.475857 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.475875 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.475899 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.475917 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.496429 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/1.log" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.496978 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/0.log" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.500479 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40" exitCode=1 Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.500515 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.500558 4791 scope.go:117] "RemoveContainer" containerID="d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.502542 4791 scope.go:117] "RemoveContainer" containerID="0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40" Feb 17 00:06:14 crc kubenswrapper[4791]: E0217 00:06:14.502811 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.526384 4791 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5
a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.543685 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.563583 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.578905 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.578966 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.578984 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.579010 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.579030 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.579756 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.596871 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.612979 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.648286 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.664983 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.682099 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.682305 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.682331 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.682366 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc 
kubenswrapper[4791]: I0217 00:06:14.682390 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.691269 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.1
26.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.713316 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.743806 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5b2767c44ba6d74dcdd85feb06a26a33340399b11e454c22e85d3a9c26c4eaa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:13Z\\\",\\\"message\\\":\\\"olicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:13.113082 6064 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 00:06:13.113171 6064 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 00:06:13.113182 6064 handler.go:190] Sending *v1.Pod event handler 6 for 
removal\\\\nI0217 00:06:13.113204 6064 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:13.113225 6064 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:13.113237 6064 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 00:06:13.113241 6064 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 00:06:13.113253 6064 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 00:06:13.113256 6064 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:13.113261 6064 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:13.113258 6064 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 00:06:13.113268 6064 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:13.113284 6064 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:13.113322 6064 factory.go:656] Stopping watch factory\\\\nI0217 00:06:13.113339 6064 ovnkube.go:599] Stopped ovnkube\\\\nI0217 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"}
,{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.761332 4791 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.782523 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.784898 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.784960 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.784979 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.785008 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.785027 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.802417 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.825011 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.887536 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.887592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.887609 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.887632 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.887650 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.991371 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.991469 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.991490 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.991513 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:14 crc kubenswrapper[4791]: I0217 00:06:14.991531 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:14Z","lastTransitionTime":"2026-02-17T00:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.095155 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.095219 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.095239 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.095265 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.095284 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.169815 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 14:47:19.500728797 +0000 UTC Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.198624 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.198690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.198709 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.198734 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.198755 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.220036 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:15 crc kubenswrapper[4791]: E0217 00:06:15.220257 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.220283 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.220354 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:15 crc kubenswrapper[4791]: E0217 00:06:15.220532 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:15 crc kubenswrapper[4791]: E0217 00:06:15.220759 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.301749 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.301818 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.301836 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.301859 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.301876 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.404681 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.404743 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.404761 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.404786 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.404803 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.506745 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/1.log" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.507216 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.507257 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.507294 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.507318 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.507338 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.516026 4791 scope.go:117] "RemoveContainer" containerID="0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40" Feb 17 00:06:15 crc kubenswrapper[4791]: E0217 00:06:15.516357 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.539687 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID
\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.558493 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.577757 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.599095 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.610459 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.610516 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.610537 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.610560 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.610578 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.619150 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd9
06b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.641473 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.664149 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.688464 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.702888 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.715577 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.715650 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.715677 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.715707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.715733 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.735529 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 
00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.751296 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.773936 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00
:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.787418 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.806588 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.818315 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.818378 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.818390 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.818408 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.818421 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.826910 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:15Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.921071 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.921226 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.921241 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.921258 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:15 crc kubenswrapper[4791]: I0217 00:06:15.921272 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:15Z","lastTransitionTime":"2026-02-17T00:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.025174 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.025549 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.025623 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.025724 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.025794 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.127343 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq"] Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.128605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.128671 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.128691 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.128719 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.128740 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.129013 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.132257 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.132474 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.149757 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.165385 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.170769 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 22:40:45.345982918 +0000 UTC Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.186594 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.205878 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.224070 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.231801 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.231865 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.231886 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.231914 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.231934 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.243074 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.255940 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1832e521-1715-432d-917c-bc0ab725e92f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.256144 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1832e521-1715-432d-917c-bc0ab725e92f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc 
kubenswrapper[4791]: I0217 00:06:16.256222 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h57s\" (UniqueName: \"kubernetes.io/projected/1832e521-1715-432d-917c-bc0ab725e92f-kube-api-access-9h57s\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.256284 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1832e521-1715-432d-917c-bc0ab725e92f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.260193 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.273816 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.292942 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.310490 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.327181 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.335047 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc 
kubenswrapper[4791]: I0217 00:06:16.335123 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.335138 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.335157 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.335169 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.346628 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.356981 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1832e521-1715-432d-917c-bc0ab725e92f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.357102 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1832e521-1715-432d-917c-bc0ab725e92f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.357261 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1832e521-1715-432d-917c-bc0ab725e92f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.357347 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h57s\" (UniqueName: \"kubernetes.io/projected/1832e521-1715-432d-917c-bc0ab725e92f-kube-api-access-9h57s\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.357909 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1832e521-1715-432d-917c-bc0ab725e92f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.357978 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1832e521-1715-432d-917c-bc0ab725e92f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.364428 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1832e521-1715-432d-917c-bc0ab725e92f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: 
\"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.421179 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h57s\" (UniqueName: \"kubernetes.io/projected/1832e521-1715-432d-917c-bc0ab725e92f-kube-api-access-9h57s\") pod \"ovnkube-control-plane-749d76644c-8tdlq\" (UID: \"1832e521-1715-432d-917c-bc0ab725e92f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.421684 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 
00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.432954 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.438304 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.438433 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.438457 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.438543 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.438563 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.442634 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" Feb 17 00:06:16 crc kubenswrapper[4791]: W0217 00:06:16.456997 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1832e521_1715_432d_917c_bc0ab725e92f.slice/crio-6f74ba2385a790a9fdb63bc609b26b66608392c46f9820bfb96ef21b95e157b1 WatchSource:0}: Error finding container 6f74ba2385a790a9fdb63bc609b26b66608392c46f9820bfb96ef21b95e157b1: Status 404 returned error can't find the container with id 6f74ba2385a790a9fdb63bc609b26b66608392c46f9820bfb96ef21b95e157b1 Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.457328 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name
\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.468620 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.522202 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" event={"ID":"1832e521-1715-432d-917c-bc0ab725e92f","Type":"ContainerStarted","Data":"6f74ba2385a790a9fdb63bc609b26b66608392c46f9820bfb96ef21b95e157b1"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.540720 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.540782 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.540802 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.540832 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.540855 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.643754 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.643797 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.643806 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.643821 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.643831 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.747483 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.747656 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.747738 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.748599 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.748649 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.852491 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.852594 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.852616 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.852676 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.852696 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.955789 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.955841 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.955889 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.955913 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:16 crc kubenswrapper[4791]: I0217 00:06:16.955930 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:16Z","lastTransitionTime":"2026-02-17T00:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.057945 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.058000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.058017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.058042 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.058060 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.161162 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.161206 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.161223 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.161246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.161265 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.171434 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 08:30:23.286603384 +0000 UTC Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.220166 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.220190 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.220177 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:17 crc kubenswrapper[4791]: E0217 00:06:17.220349 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:17 crc kubenswrapper[4791]: E0217 00:06:17.220689 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:17 crc kubenswrapper[4791]: E0217 00:06:17.220839 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.263794 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.263845 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.263859 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.263877 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.263889 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.366855 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.366886 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.366897 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.366910 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.366918 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.470403 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.470478 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.470501 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.470527 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.470544 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.530383 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" event={"ID":"1832e521-1715-432d-917c-bc0ab725e92f","Type":"ContainerStarted","Data":"e0f54cf1746cd7a9cc213dadaf7dc256f06b16d9062fba3ddcfc389cb2ff5dcb"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.530447 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" event={"ID":"1832e521-1715-432d-917c-bc0ab725e92f","Type":"ContainerStarted","Data":"6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.554740 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"
image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false
,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.573104 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.573191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.573210 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.573234 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.573251 4791 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.574736 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b16d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.596587 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582577
1aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.617977 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.618997 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-6x28n"] Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.619698 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:17 crc kubenswrapper[4791]: E0217 00:06:17.619799 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.638197 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.675771 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.675828 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.675849 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.675874 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.675893 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.686680 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.706168 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.727299 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.751674 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.763339 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.771816 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnzq7\" (UniqueName: \"kubernetes.io/projected/1d97cf45-2324-494c-839f-6f264eba3828-kube-api-access-tnzq7\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.771844 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod 
\"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.776067 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.777567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.777609 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.777622 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.777639 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.777652 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.794781 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.816926 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 
00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.828974 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.844124 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.858932 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.872894 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnzq7\" (UniqueName: \"kubernetes.io/projected/1d97cf45-2324-494c-839f-6f264eba3828-kube-api-access-tnzq7\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.872962 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:17 crc kubenswrapper[4791]: E0217 00:06:17.873140 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:17 crc kubenswrapper[4791]: E0217 00:06:17.873215 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:18.373196368 +0000 UTC m=+35.852708905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.878072 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.879733 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.879789 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.879818 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.879848 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.879873 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.892473 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.899894 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnzq7\" (UniqueName: \"kubernetes.io/projected/1d97cf45-2324-494c-839f-6f264eba3828-kube-api-access-tnzq7\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.923226 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.942440 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.956364 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.970141 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.983409 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.983489 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.983517 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.983549 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.983571 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:17Z","lastTransitionTime":"2026-02-17T00:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.984340 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:17 crc kubenswrapper[4791]: I0217 00:06:17.996423 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.013310 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.027496 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.042674 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.056351 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.075205 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name
\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.086860 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.086941 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.086965 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.086995 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.087015 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.106605 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.119178 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.136562 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc 
kubenswrapper[4791]: I0217 00:06:18.168232 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.172444 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 19:30:26.139074475 +0000 UTC Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.189636 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.189697 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.189717 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.189741 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.189758 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.293212 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.293282 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.293302 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.293326 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.293344 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.380488 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.380774 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.380900 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:19.380868847 +0000 UTC m=+36.860381404 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.396988 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.397054 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.397071 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.397094 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.397138 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.500547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.500628 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.500654 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.500685 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.500708 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.603950 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.604005 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.604023 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.604048 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.604068 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.707241 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.707305 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.707332 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.707362 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.707383 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.809752 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.809808 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.809823 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.809841 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.809855 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.884872 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.885149 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 00:06:34.885067368 +0000 UTC m=+52.364579935 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.913048 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.913151 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.913175 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.913204 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.913221 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:18Z","lastTransitionTime":"2026-02-17T00:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.986275 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.986380 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.986419 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:18 crc kubenswrapper[4791]: I0217 00:06:18.986456 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986540 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986577 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986589 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986594 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986604 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986636 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986614 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986715 4791 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:18 crc 
kubenswrapper[4791]: E0217 00:06:18.986644 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:34.986627543 +0000 UTC m=+52.466140070 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986785 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:34.986761988 +0000 UTC m=+52.466274555 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986807 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:34.986795589 +0000 UTC m=+52.466308146 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:18 crc kubenswrapper[4791]: E0217 00:06:18.986826 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:34.986816429 +0000 UTC m=+52.466328996 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.015924 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.015964 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.015975 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.015992 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.016015 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.118756 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.118827 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.118849 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.118881 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.118904 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.123424 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.123488 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.123511 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.123535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.123555 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.145339 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.150688 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.150758 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.150782 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.150815 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.150837 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.172075 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.173188 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 10:24:24.324309054 +0000 UTC Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.177983 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.178068 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.178093 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.178159 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.178184 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.207350 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.212858 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.212910 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.212929 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.212955 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.212973 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.220560 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.220634 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.220664 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.220761 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.220793 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.220919 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.221046 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.221182 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.239062 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.245810 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.245846 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.245857 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.245874 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.245887 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.263798 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.263950 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.265844 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.265895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.265912 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.265935 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.265952 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.369478 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.369564 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.369587 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.369618 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.369642 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.391053 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.391237 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:19 crc kubenswrapper[4791]: E0217 00:06:19.391332 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:21.391305293 +0000 UTC m=+38.870817850 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.472895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.472969 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.472989 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.473017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.473036 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.576623 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.576716 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.576736 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.576758 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.576775 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.680375 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.680452 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.680475 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.680501 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.680525 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.784580 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.784641 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.784658 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.784682 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.784700 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.887460 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.887520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.887538 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.887562 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.887579 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.990885 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.990940 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.990957 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.990980 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:19 crc kubenswrapper[4791]: I0217 00:06:19.990996 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:19Z","lastTransitionTime":"2026-02-17T00:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.094739 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.094812 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.094836 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.094860 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.094884 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.173596 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 19:26:47.645229442 +0000 UTC Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.198079 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.198161 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.198179 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.198203 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.198221 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.300733 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.300768 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.300777 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.300792 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.300803 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.403579 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.403665 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.403689 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.403716 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.403734 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.506918 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.506976 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.507000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.507024 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.507045 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.609783 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.609853 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.609888 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.609925 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.609947 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.713133 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.713198 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.713222 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.713364 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.713411 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.815396 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.815444 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.815461 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.815487 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.815504 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.918674 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.918754 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.918779 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.918802 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:20 crc kubenswrapper[4791]: I0217 00:06:20.918819 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:20Z","lastTransitionTime":"2026-02-17T00:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.022031 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.022089 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.022147 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.022181 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.022235 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.124778 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.124805 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.124814 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.124826 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.124834 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.174074 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 09:00:07.473383185 +0000 UTC
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.219780 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.219873 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.219908 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.219991 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:06:21 crc kubenswrapper[4791]: E0217 00:06:21.219985 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:06:21 crc kubenswrapper[4791]: E0217 00:06:21.220158 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:06:21 crc kubenswrapper[4791]: E0217 00:06:21.220460 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:06:21 crc kubenswrapper[4791]: E0217 00:06:21.220585 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.227412 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.227467 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.227494 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.227518 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.227536 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.330538 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.330605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.330628 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.330663 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.330687 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.411613 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:06:21 crc kubenswrapper[4791]: E0217 00:06:21.411776 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 00:06:21 crc kubenswrapper[4791]: E0217 00:06:21.411845 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:25.411822393 +0000 UTC m=+42.891334930 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.432724 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.432775 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.432790 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.432810 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.432826 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.535177 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.535220 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.535229 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.535243 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.535253 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.638211 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.638251 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.638263 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.638279 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.638291 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.741448 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.741506 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.741526 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.741555 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.741574 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.844283 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.844348 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.844373 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.844404 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.844427 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.946883 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.946953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.946978 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.947006 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:21 crc kubenswrapper[4791]: I0217 00:06:21.947026 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:21Z","lastTransitionTime":"2026-02-17T00:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.049706 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.049744 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.049752 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.049765 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.049774 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.153272 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.153336 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.153357 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.153387 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.153405 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.175150 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 17:14:42.131943143 +0000 UTC
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.256664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.256725 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.256743 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.256766 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.256787 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.359424 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.359473 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.359490 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.359512 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.359529 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.461820 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.461860 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.461877 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.461897 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.461914 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.564810 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.564868 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.564886 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.564908 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.564925 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.666567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.666628 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.666645 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.666670 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.666688 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.769738 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.769840 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.769863 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.769888 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.769907 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.874028 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.874143 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.874168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.874198 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.874220 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.977169 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.977228 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.977246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.977270 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:22 crc kubenswrapper[4791]: I0217 00:06:22.977288 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:22Z","lastTransitionTime":"2026-02-17T00:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.080535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.080605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.080625 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.080652 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.080669 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.175537 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 17:08:04.891064426 +0000 UTC
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.183636 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.183697 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.183719 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.183748 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.183771 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.219806 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.219891 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.219942 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.219954 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:06:23 crc kubenswrapper[4791]: E0217 00:06:23.220188 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:06:23 crc kubenswrapper[4791]: E0217 00:06:23.220477 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:06:23 crc kubenswrapper[4791]: E0217 00:06:23.220627 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:06:23 crc kubenswrapper[4791]: E0217 00:06:23.220825 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.243584 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.265141 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.286359 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.286419 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.286436 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 
00:06:23.286459 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.286475 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.291056 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a818
1dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.313704 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.339866 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.358096 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.374597 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.389021 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.389141 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.389168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.389199 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.389223 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.392784 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.411277 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.427197 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.467508 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.485022 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.492039 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.492354 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.492495 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.492634 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc 
kubenswrapper[4791]: I0217 00:06:23.492782 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.508642 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.1
26.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.528181 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.561958 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 
00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.580361 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.596047 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.596135 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.596154 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.596183 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.596202 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.598023 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:23 crc 
kubenswrapper[4791]: I0217 00:06:23.699162 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.699509 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.699619 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.699734 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.699850 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.802965 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.803023 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.803045 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.803074 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.803095 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.906690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.906810 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.906834 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.906861 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:23 crc kubenswrapper[4791]: I0217 00:06:23.906888 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:23Z","lastTransitionTime":"2026-02-17T00:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.010307 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.010386 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.010405 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.010429 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.010447 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.113097 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.113144 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.113153 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.113168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.113178 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.176509 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 22:44:06.38731224 +0000 UTC Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.216546 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.216612 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.216631 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.216655 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.216672 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.322766 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.322847 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.322865 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.322894 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.322913 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.426965 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.427004 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.427021 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.427046 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.427065 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.529862 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.529925 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.529949 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.529979 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.530002 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.632978 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.633031 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.633051 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.633080 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.633135 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.735951 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.736011 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.736029 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.736059 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.736077 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.839255 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.839343 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.839365 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.839398 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.839421 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.942235 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.942290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.942310 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.942334 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:24 crc kubenswrapper[4791]: I0217 00:06:24.942350 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:24Z","lastTransitionTime":"2026-02-17T00:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.044800 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.044860 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.044876 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.044899 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.044918 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.148454 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.148548 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.148571 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.148597 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.148614 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.177275 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 14:42:04.911961008 +0000 UTC Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.220372 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.220449 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:25 crc kubenswrapper[4791]: E0217 00:06:25.220594 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.220680 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.220718 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:25 crc kubenswrapper[4791]: E0217 00:06:25.220891 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:25 crc kubenswrapper[4791]: E0217 00:06:25.220986 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:25 crc kubenswrapper[4791]: E0217 00:06:25.221090 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.252101 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.252191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.252209 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.252231 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.252249 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.355792 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.355867 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.355891 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.355920 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.355944 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.453069 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:25 crc kubenswrapper[4791]: E0217 00:06:25.453330 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:25 crc kubenswrapper[4791]: E0217 00:06:25.453452 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:33.453421971 +0000 UTC m=+50.932934528 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.459691 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.459745 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.459758 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.459777 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.459791 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.562685 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.562760 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.562783 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.562813 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.562836 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.665681 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.665728 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.665740 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.665755 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.665767 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.769035 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.769094 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.769144 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.769171 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.769192 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.872621 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.872710 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.872737 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.872769 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.872794 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.976334 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.976394 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.976416 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.976441 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:25 crc kubenswrapper[4791]: I0217 00:06:25.976462 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:25Z","lastTransitionTime":"2026-02-17T00:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.079758 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.079798 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.079809 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.079824 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.079834 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.177571 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 22:13:08.525395331 +0000 UTC Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.183383 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.183446 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.183471 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.183500 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.183522 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.286035 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.286150 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.286176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.286208 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.286235 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.389074 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.389187 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.389209 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.389234 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.389253 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.492805 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.492864 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.492882 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.492906 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.492923 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.595894 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.595985 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.596010 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.596043 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.596068 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.699591 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.699680 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.699704 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.699734 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.699767 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.802747 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.802907 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.802929 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.802953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.802970 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.905962 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.906006 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.906022 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.906045 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:26 crc kubenswrapper[4791]: I0217 00:06:26.906062 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:26Z","lastTransitionTime":"2026-02-17T00:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.010181 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.010239 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.010255 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.010279 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.010296 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.113297 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.113366 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.113391 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.113421 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.113443 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.178214 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 20:35:28.399948158 +0000 UTC Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.216008 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.216031 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.216039 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.216051 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.216060 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.219521 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.219555 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.219647 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:27 crc kubenswrapper[4791]: E0217 00:06:27.219751 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.219837 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:27 crc kubenswrapper[4791]: E0217 00:06:27.219941 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:27 crc kubenswrapper[4791]: E0217 00:06:27.220066 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:27 crc kubenswrapper[4791]: E0217 00:06:27.220186 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.318808 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.318907 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.318934 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.318969 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.318993 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.422463 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.422519 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.422579 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.422606 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.422622 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.525828 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.525920 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.525937 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.525960 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.525977 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.628265 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.628301 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.628309 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.628321 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.628332 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.731453 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.731519 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.731536 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.731560 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.731578 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.834829 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.834890 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.834908 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.834932 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.834948 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.937592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.937709 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.937734 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.937762 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:27 crc kubenswrapper[4791]: I0217 00:06:27.937784 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:27Z","lastTransitionTime":"2026-02-17T00:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.041193 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.041234 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.041244 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.041258 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.041267 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.144190 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.144258 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.144277 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.144300 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.144319 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.179185 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 01:08:52.722765387 +0000 UTC Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.247373 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.247431 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.247458 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.247487 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.247505 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.350945 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.351007 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.351027 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.351050 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.351069 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.454469 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.454520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.454535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.454556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.454576 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.558230 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.558306 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.558331 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.558357 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.558421 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.661237 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.661309 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.661327 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.661353 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.661373 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.763840 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.763885 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.763893 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.763919 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.763929 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.867248 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.867301 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.867310 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.867325 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.867336 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.970942 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.971009 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.971026 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.971051 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:28 crc kubenswrapper[4791]: I0217 00:06:28.971070 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:28Z","lastTransitionTime":"2026-02-17T00:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.073920 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.074151 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.074181 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.074214 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.074235 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.177893 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.177952 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.177971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.177997 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.178015 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.180283 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 19:05:19.508331486 +0000 UTC Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.219598 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.219773 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.219872 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.219884 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.219926 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.220618 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.220697 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.220811 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.220997 4791 scope.go:117] "RemoveContainer" containerID="0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.283874 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.284231 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.284248 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.284269 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.284286 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.386974 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.387053 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.387076 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.387142 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.387169 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.477363 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.477412 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.477430 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.477452 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.477470 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.498902 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.504736 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.504792 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.504816 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.504845 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.504862 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.518695 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.523256 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.523300 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.523312 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.523330 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.523344 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.541338 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.546466 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.546504 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.546515 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.546532 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.546544 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.564428 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.569507 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.569544 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.569556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.569570 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.569582 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.574927 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/1.log" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.577980 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.578407 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.589723 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: E0217 00:06:29.590098 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.592673 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.592737 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.592755 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.592780 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.592798 4791 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.598203 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.621528 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.650507 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.671618 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.696424 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.696491 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.696513 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc 
kubenswrapper[4791]: I0217 00:06:29.696542 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.696584 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.698905 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.718024 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.734240 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.756373 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.774200 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.800137 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.800193 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.800212 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.800237 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.800256 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.801587 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.828711 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.856729 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 
00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath
\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.870906 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/service
ca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.886392 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.902277 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.902311 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.902320 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.902333 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.902342 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:29Z","lastTransitionTime":"2026-02-17T00:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.911514 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.920341 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:29 crc kubenswrapper[4791]: I0217 00:06:29.933726 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.004957 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.004991 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.005002 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.005017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.005027 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.107021 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.107067 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.107075 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.107089 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.107097 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.180853 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 13:37:05.289376691 +0000 UTC Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.209864 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.209923 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.209941 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.209966 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.209983 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.312459 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.312493 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.312504 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.312520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.312532 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.414898 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.414961 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.414979 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.415004 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.415022 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.518670 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.518734 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.518754 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.518777 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.518795 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.585839 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/2.log" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.586877 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/1.log" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.591641 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1" exitCode=1 Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.591709 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.591768 4791 scope.go:117] "RemoveContainer" containerID="0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.592907 4791 scope.go:117] "RemoveContainer" containerID="1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1" Feb 17 00:06:30 crc kubenswrapper[4791]: E0217 00:06:30.593196 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.607784 4791 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.621928 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc 
kubenswrapper[4791]: I0217 00:06:30.623480 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.623557 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.623583 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.623622 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.623649 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.648656 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.661596 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.704279 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.726590 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.726667 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.726692 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.726716 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.726731 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.734099 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.755579 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0447c886ed6f06f1f58963535c53b2de3e27e1cbb4b9210cd507691736736f40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:14Z\\\",\\\"message\\\":\\\"ndler 5 for removal\\\\nI0217 00:06:14.432585 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:14.432616 6188 factory.go:656] Stopping watch factory\\\\nI0217 00:06:14.432659 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:14.432596 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 
00:06:14.432725 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:14.432697 6188 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.432953 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:14.433009 6188 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433215 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:14.433466 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 
00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni
/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.771482 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.788744 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.806964 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.824479 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: 
I0217 00:06:30.829612 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.829692 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.829718 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.829751 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.829778 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.841549 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b16d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.864173 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T
00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69
183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.885865 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.904805 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.924306 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.932583 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.933229 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.933568 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.933744 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.933926 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:30Z","lastTransitionTime":"2026-02-17T00:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:30 crc kubenswrapper[4791]: I0217 00:06:30.943818 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.037206 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.037629 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.037694 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.037768 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.037848 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.140237 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.140282 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.140297 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.140316 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.140329 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.181205 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 18:38:58.719247442 +0000 UTC Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.219388 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.219498 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.219517 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.219625 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:31 crc kubenswrapper[4791]: E0217 00:06:31.219619 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:31 crc kubenswrapper[4791]: E0217 00:06:31.219761 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:31 crc kubenswrapper[4791]: E0217 00:06:31.219941 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:31 crc kubenswrapper[4791]: E0217 00:06:31.220176 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.244518 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.244592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.244615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.244641 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.244660 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.347556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.347613 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.347627 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.347644 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.347657 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.450760 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.450809 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.450819 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.450835 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.450846 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.554454 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.554528 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.554546 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.554573 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.554590 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.597249 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/2.log" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.600190 4791 scope.go:117] "RemoveContainer" containerID="1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1" Feb 17 00:06:31 crc kubenswrapper[4791]: E0217 00:06:31.600346 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.642681 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.656463 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.656514 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.656524 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.656540 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.656552 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.658095 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77
783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.668203 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.678000 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.688706 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.697358 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.705936 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.717140 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.727542 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.738699 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.754189 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name
\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.758989 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.759023 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.759034 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.759048 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.759062 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.776178 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.786288 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.802206 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc 
kubenswrapper[4791]: I0217 00:06:31.834000 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.848503 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.860404 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:31Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.861702 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.861768 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.861788 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.861812 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.861829 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.964597 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.964650 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.964664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.964681 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:31 crc kubenswrapper[4791]: I0217 00:06:31.964697 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:31Z","lastTransitionTime":"2026-02-17T00:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.067701 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.067776 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.067799 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.067829 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.067852 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.171617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.171690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.171716 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.171745 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.171767 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.182432 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 13:23:34.422248985 +0000 UTC Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.275090 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.275179 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.275202 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.275233 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.275257 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.377484 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.377559 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.377579 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.377606 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.377625 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.480020 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.480086 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.480104 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.480168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.480186 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.583798 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.583861 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.583883 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.583911 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.583931 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.686842 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.686901 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.686921 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.686945 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.686967 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.789287 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.789354 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.789372 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.789398 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.789416 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.893003 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.893073 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.893097 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.893156 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:32 crc kubenswrapper[4791]: I0217 00:06:32.893181 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:32Z","lastTransitionTime":"2026-02-17T00:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.000466 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.000605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.000632 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.000662 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.000682 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.095396 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.103734 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.103841 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.103992 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.104023 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.104046 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.108255 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.132050 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e958
0356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.148212 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.170475 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.183191 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 10:08:28.943175902 +0000 UTC Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.190680 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 
17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.207568 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.207622 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.207645 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.207669 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.207687 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.220427 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.220527 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:33 crc kubenswrapper[4791]: E0217 00:06:33.220660 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.220698 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.220806 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:33 crc kubenswrapper[4791]: E0217 00:06:33.220947 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:33 crc kubenswrapper[4791]: E0217 00:06:33.221055 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:33 crc kubenswrapper[4791]: E0217 00:06:33.221198 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.224094 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending 
*v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.240069 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.255759 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc 
kubenswrapper[4791]: I0217 00:06:33.275601 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.294286 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.310747 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.310824 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.310842 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.310869 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.310888 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.318848 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.344959 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.365832 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.384166 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.403272 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.414246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.414288 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.414301 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.414318 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.414330 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.422703 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.439215 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.455737 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.472275 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.493300 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.513598 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.516880 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.516924 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.516942 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.516966 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.516983 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.532602 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.540174 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:33 crc kubenswrapper[4791]: E0217 00:06:33.540339 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:33 crc kubenswrapper[4791]: E0217 00:06:33.540435 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:06:49.540408103 +0000 UTC m=+67.019920660 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.550991 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.574530 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.591783 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.608531 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc 
kubenswrapper[4791]: I0217 00:06:33.619917 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.619977 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.619999 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.620024 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.620043 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.638868 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.654314 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.674154 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.693061 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name
\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.723267 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.723362 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.723378 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.723402 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.723421 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.725154 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.742641 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.765185 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.788394 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.808582 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.826619 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.826691 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.826718 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.826750 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.826777 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.833773 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.929605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.929674 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.929690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.929716 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:33 crc kubenswrapper[4791]: I0217 00:06:33.929736 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:33Z","lastTransitionTime":"2026-02-17T00:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.032895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.033015 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.033044 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.033077 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.033100 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.136766 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.136825 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.136843 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.136866 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.136883 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.184374 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 00:48:00.836081502 +0000 UTC Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.240329 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.240386 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.240403 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.240428 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.240446 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.343418 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.343498 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.343522 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.343553 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.343578 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.446274 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.446311 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.446323 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.446340 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.446354 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.548877 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.548944 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.548971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.549001 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.549037 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.652064 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.652147 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.652167 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.652191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.652209 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.755663 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.755895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.755913 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.755938 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.755955 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.858499 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.858549 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.858565 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.858588 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.858605 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.956954 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:06:34 crc kubenswrapper[4791]: E0217 00:06:34.957198 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 00:07:06.957158339 +0000 UTC m=+84.436670906 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.961000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.961058 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.961076 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.961101 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:34 crc kubenswrapper[4791]: I0217 00:06:34.961147 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:34Z","lastTransitionTime":"2026-02-17T00:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.058244 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.058319 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.058366 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.058440 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058454 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058496 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058493 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058516 4791 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058589 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:07:07.058560979 +0000 UTC m=+84.538073536 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058615 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-02-17 00:07:07.058603481 +0000 UTC m=+84.538116048 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058680 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058699 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058825 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:07:07.058796677 +0000 UTC m=+84.538309244 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058717 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058872 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.058942 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 00:07:07.058926981 +0000 UTC m=+84.538439548 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.064608 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.064655 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.064672 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.064701 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.064721 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.167532 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.167600 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.167617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.167643 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.167664 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.185049 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 02:14:40.315048597 +0000 UTC Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.219805 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.219842 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.219911 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.220079 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.220140 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.220263 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.220366 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:35 crc kubenswrapper[4791]: E0217 00:06:35.220657 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.274535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.274984 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.275318 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.275536 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.275760 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.379830 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.379901 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.379922 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.379951 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.379974 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.483740 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.483815 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.483837 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.483862 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.483879 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.586667 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.586946 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.587073 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.587236 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.587390 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.691017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.691085 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.691104 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.691159 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.691178 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.794428 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.794487 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.794505 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.794531 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.794550 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.898751 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.898835 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.898858 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.898888 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:35 crc kubenswrapper[4791]: I0217 00:06:35.898911 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:35Z","lastTransitionTime":"2026-02-17T00:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.002326 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.002379 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.002397 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.002420 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.002437 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.106573 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.106631 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.106649 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.106672 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.106695 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.185638 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 11:57:31.235483287 +0000 UTC Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.210452 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.210505 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.210522 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.210547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.210565 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.314785 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.314852 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.314871 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.314896 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.314913 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.417102 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.417178 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.417193 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.417211 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.417222 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.519940 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.520020 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.520044 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.520074 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.520096 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.622353 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.622413 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.622434 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.622458 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.622476 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.725890 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.725959 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.725976 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.726002 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.726015 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.828085 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.828165 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.828179 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.828197 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.828209 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.931168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.931222 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.931233 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.931249 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:36 crc kubenswrapper[4791]: I0217 00:06:36.931262 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:36Z","lastTransitionTime":"2026-02-17T00:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.035795 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.035868 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.035895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.035924 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.035947 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.139550 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.139627 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.139651 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.139680 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.139702 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.186286 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 13:01:18.783157753 +0000 UTC Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.220205 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.220205 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:37 crc kubenswrapper[4791]: E0217 00:06:37.220449 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.220502 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.220257 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:37 crc kubenswrapper[4791]: E0217 00:06:37.220578 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:37 crc kubenswrapper[4791]: E0217 00:06:37.220706 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:37 crc kubenswrapper[4791]: E0217 00:06:37.220852 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.242579 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.242663 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.242712 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.242733 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.242750 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.346161 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.346253 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.346282 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.346309 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.346327 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.449156 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.449233 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.449260 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.449290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.449313 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.552856 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.552950 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.552977 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.553012 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.553037 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.655888 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.655944 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.655961 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.655983 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.656000 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.761981 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.762145 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.762168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.762184 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.762196 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.864822 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.864876 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.864895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.864917 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.864933 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.967532 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.967616 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.967646 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.967675 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:37 crc kubenswrapper[4791]: I0217 00:06:37.967695 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:37Z","lastTransitionTime":"2026-02-17T00:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.069937 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.069994 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.070009 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.070033 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.070050 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.173457 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.173525 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.173542 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.173568 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.173588 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.187322 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 20:11:42.03298054 +0000 UTC Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.276707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.276762 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.276779 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.276801 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.276817 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.380509 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.380565 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.380583 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.380605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.380623 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.483433 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.483505 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.483541 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.483574 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.483595 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.585782 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.585968 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.586027 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.586053 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.586070 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.689336 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.689411 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.689429 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.689454 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.689471 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.793509 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.793553 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.793570 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.793594 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.793611 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.896194 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.896264 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.896282 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.896306 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.896323 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.998714 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.998765 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.998780 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.998798 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:38 crc kubenswrapper[4791]: I0217 00:06:38.998817 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:38Z","lastTransitionTime":"2026-02-17T00:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.100938 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.101007 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.101024 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.101041 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.101053 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.188176 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 14:53:39.640429289 +0000 UTC Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.204064 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.204226 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.204253 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.204286 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.204309 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.219625 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.219647 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.219749 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.219792 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.220019 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.220851 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.220994 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.220565 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.306877 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.306937 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.306953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.306979 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.306998 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.410007 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.410232 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.410315 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.410404 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.410435 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.513828 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.513912 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.513931 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.513953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.514002 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.617816 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.617928 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.617979 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.618004 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.618048 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.721451 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.721516 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.721539 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.721565 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.721585 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.824932 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.825014 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.825037 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.825068 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.825091 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.853987 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.854051 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.854069 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.854093 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.854155 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.871340 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.876984 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.877035 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.877052 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.877076 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.877093 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.965624 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.970184 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.970242 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.970262 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.970287 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.970308 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.991920 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:39 crc kubenswrapper[4791]: E0217 00:06:39.992167 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.994375 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.994451 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.994478 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.994511 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:39 crc kubenswrapper[4791]: I0217 00:06:39.994536 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:39Z","lastTransitionTime":"2026-02-17T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.097743 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.097799 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.097819 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.097845 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.097863 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.188967 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 01:02:42.630809025 +0000 UTC Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.200662 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.200725 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.200744 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.200767 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.200785 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.304058 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.304156 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.304176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.304208 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.304229 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.407188 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.407261 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.407285 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.407316 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.407339 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.510918 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.510986 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.511010 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.511046 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.511067 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.614972 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.615036 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.615053 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.615078 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.615095 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.719103 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.719212 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.719250 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.719283 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.719307 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.822632 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.822762 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.822792 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.822824 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.822847 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.926497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.926576 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.926601 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.926636 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:40 crc kubenswrapper[4791]: I0217 00:06:40.926660 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:40Z","lastTransitionTime":"2026-02-17T00:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.030244 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.030322 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.030342 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.030375 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.030399 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.134158 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.134208 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.134219 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.134238 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.134251 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.189905 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 01:30:41.186683997 +0000 UTC Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.219523 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.219582 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:41 crc kubenswrapper[4791]: E0217 00:06:41.219728 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.219764 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.219864 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:41 crc kubenswrapper[4791]: E0217 00:06:41.220012 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:41 crc kubenswrapper[4791]: E0217 00:06:41.220182 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:41 crc kubenswrapper[4791]: E0217 00:06:41.220350 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.236616 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.236666 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.236683 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.236706 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.236729 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.340058 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.340134 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.340153 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.340176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.340194 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.443327 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.443377 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.443396 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.443420 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.443437 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.546000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.546062 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.546079 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.546143 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.546162 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.684598 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.684665 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.684680 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.684700 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.684716 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.787576 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.787642 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.787659 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.787687 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.787705 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.894198 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.894304 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.894324 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.894349 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.894366 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.997914 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.997982 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.997998 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.998023 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:41 crc kubenswrapper[4791]: I0217 00:06:41.998040 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:41Z","lastTransitionTime":"2026-02-17T00:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.100915 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.100945 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.100954 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.100970 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.100980 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.190479 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 17:33:39.799139837 +0000 UTC Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.204176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.204252 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.204302 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.204334 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.204358 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.306908 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.306981 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.307007 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.307037 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.307062 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.410516 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.410598 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.410628 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.410660 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.410684 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.513646 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.513704 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.513721 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.513747 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.513764 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.617593 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.617687 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.617714 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.617748 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.617773 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.720808 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.720865 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.720882 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.720905 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.720922 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.823692 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.823757 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.823774 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.823799 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.823817 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.926325 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.926395 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.926414 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.926444 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:42 crc kubenswrapper[4791]: I0217 00:06:42.926463 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:42Z","lastTransitionTime":"2026-02-17T00:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.029268 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.029320 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.029336 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.029361 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.029378 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.132443 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.132490 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.132507 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.132530 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.132548 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.190915 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 23:06:26.454486286 +0000 UTC Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.219461 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.219512 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.219480 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.219624 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:43 crc kubenswrapper[4791]: E0217 00:06:43.219817 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:43 crc kubenswrapper[4791]: E0217 00:06:43.222702 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:43 crc kubenswrapper[4791]: E0217 00:06:43.224281 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:43 crc kubenswrapper[4791]: E0217 00:06:43.224370 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.225511 4791 scope.go:117] "RemoveContainer" containerID="1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1" Feb 17 00:06:43 crc kubenswrapper[4791]: E0217 00:06:43.225848 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.235785 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.235839 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.235858 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.235883 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.235901 4791 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.242174 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.265434 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.284192 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.308724 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.330449 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.339892 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.339934 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.339951 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.339977 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.339995 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.354313 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b0
84652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"na
me\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.368895 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.385391 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.406134 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.420191 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.434282 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.442793 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.442845 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.442868 4791 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.442900 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.442922 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.467475 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994
bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.483618 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.504574 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.523763 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name
\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.546003 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.546050 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.546066 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.546090 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.546193 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.555096 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.570919 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.586843 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:43 crc 
kubenswrapper[4791]: I0217 00:06:43.649073 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.649143 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.649159 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.649191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.649207 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.752764 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.752842 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.752865 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.752897 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.752922 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.856358 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.856414 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.856432 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.856457 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.856474 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.959196 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.959257 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.959280 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.959308 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:43 crc kubenswrapper[4791]: I0217 00:06:43.959328 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:43Z","lastTransitionTime":"2026-02-17T00:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.062092 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.062186 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.062211 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.062240 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.062260 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.165246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.165306 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.165330 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.165358 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.165380 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.192011 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 09:21:50.657359513 +0000 UTC Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.268018 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.268081 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.268103 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.268174 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.268196 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.370723 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.370797 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.370822 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.370852 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.370874 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.474177 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.474247 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.474275 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.474304 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.474324 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.577531 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.577588 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.577605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.577629 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.577647 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.680614 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.680668 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.680687 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.680711 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.680729 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.783895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.783961 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.783979 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.784004 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.784024 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.887175 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.887232 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.887249 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.887271 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.887288 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.989800 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.989863 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.989880 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.989905 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:44 crc kubenswrapper[4791]: I0217 00:06:44.989925 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:44Z","lastTransitionTime":"2026-02-17T00:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.093443 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.093498 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.093515 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.093539 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.093555 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.192907 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 14:38:29.412049599 +0000 UTC Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.196498 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.196569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.196596 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.196627 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.196647 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.220141 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.220221 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.220247 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:45 crc kubenswrapper[4791]: E0217 00:06:45.220403 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.220493 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:45 crc kubenswrapper[4791]: E0217 00:06:45.220680 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:45 crc kubenswrapper[4791]: E0217 00:06:45.220774 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:45 crc kubenswrapper[4791]: E0217 00:06:45.220883 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.300277 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.300327 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.300354 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.300381 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.300401 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.402569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.402620 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.402637 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.402658 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.402676 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.505528 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.505591 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.505611 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.505635 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.505652 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.608537 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.608597 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.608618 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.608642 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.608660 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.710855 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.710929 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.710952 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.710982 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.711000 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.813402 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.813462 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.813495 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.813532 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.813556 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.915935 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.916021 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.916050 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.916088 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:45 crc kubenswrapper[4791]: I0217 00:06:45.916143 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:45Z","lastTransitionTime":"2026-02-17T00:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.018468 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.018528 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.018547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.018572 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.018590 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.120441 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.120492 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.120508 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.120525 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.120539 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.193229 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 06:48:58.722898515 +0000 UTC Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.229451 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.229554 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.229577 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.230072 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.230354 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.333978 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.334029 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.334046 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.334067 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.335202 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.438318 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.438394 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.438419 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.438452 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.438477 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.541971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.542028 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.542046 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.542070 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.542087 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.644657 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.644726 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.644745 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.644769 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.644786 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.747995 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.748068 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.748086 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.748218 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.748250 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.850747 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.850808 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.850832 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.850861 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.850883 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.953806 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.953869 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.953889 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.953913 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:46 crc kubenswrapper[4791]: I0217 00:06:46.953930 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:46Z","lastTransitionTime":"2026-02-17T00:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.056551 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.056615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.056636 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.056660 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.056676 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.160582 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.160651 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.160664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.160679 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.161280 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.193512 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 01:06:18.836079387 +0000 UTC Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.219994 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.220033 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.220058 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:47 crc kubenswrapper[4791]: E0217 00:06:47.220206 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:47 crc kubenswrapper[4791]: E0217 00:06:47.220333 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.220382 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:47 crc kubenswrapper[4791]: E0217 00:06:47.220578 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:47 crc kubenswrapper[4791]: E0217 00:06:47.220671 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.264598 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.264644 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.264661 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.264684 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.264700 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.368246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.368297 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.368321 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.368349 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.368369 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.470657 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.470698 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.470707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.470723 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.470732 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.574516 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.574600 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.574624 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.574658 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.574685 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.677472 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.677510 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.677521 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.677535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.677545 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.780186 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.780219 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.780228 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.780242 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.780253 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.882490 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.882527 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.882537 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.882555 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.882566 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.984593 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.984631 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.984643 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.984659 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:47 crc kubenswrapper[4791]: I0217 00:06:47.984671 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:47Z","lastTransitionTime":"2026-02-17T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.087218 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.087244 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.087252 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.087266 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.087275 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.188743 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.188770 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.188778 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.188789 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.188797 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.194092 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 03:00:06.599822894 +0000 UTC Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.292065 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.292172 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.292191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.292249 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.292270 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.395517 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.395574 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.395591 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.395615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.395632 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.499468 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.499517 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.499529 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.499546 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.499557 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.602164 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.602253 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.602271 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.602294 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.602311 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.704354 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.704402 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.704414 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.704432 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.704444 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.806659 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.806702 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.806714 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.806732 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.806745 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.908869 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.908923 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.908940 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.908963 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:48 crc kubenswrapper[4791]: I0217 00:06:48.908980 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:48Z","lastTransitionTime":"2026-02-17T00:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.011839 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.011880 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.011891 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.011908 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.011920 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.113897 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.113920 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.113928 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.113941 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.113950 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.195151 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 11:59:12.358390211 +0000 UTC Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.215910 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.215933 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.215941 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.215953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.215963 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.220309 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.220339 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:49 crc kubenswrapper[4791]: E0217 00:06:49.220427 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.220387 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.220387 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:49 crc kubenswrapper[4791]: E0217 00:06:49.220563 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:49 crc kubenswrapper[4791]: E0217 00:06:49.220618 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:49 crc kubenswrapper[4791]: E0217 00:06:49.220726 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.318505 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.318543 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.318553 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.318569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.318580 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.420784 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.420821 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.420833 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.420851 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.420863 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.522623 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.522658 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.522665 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.522678 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.522687 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.612983 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:49 crc kubenswrapper[4791]: E0217 00:06:49.613154 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:49 crc kubenswrapper[4791]: E0217 00:06:49.613203 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:07:21.613190969 +0000 UTC m=+99.092703496 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.624596 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.624627 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.624636 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.624650 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.624659 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.726603 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.726665 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.726688 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.726713 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.726732 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.829788 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.829844 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.829857 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.829878 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.829889 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.932063 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.932118 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.932128 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.932144 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:49 crc kubenswrapper[4791]: I0217 00:06:49.932154 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:49Z","lastTransitionTime":"2026-02-17T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.035118 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.035183 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.035196 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.035238 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.035253 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.137800 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.137849 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.137860 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.137876 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.137887 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.196221 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 10:30:55.297314149 +0000 UTC Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.239556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.239583 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.239594 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.239606 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.239615 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.254955 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.254989 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.255000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.255017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.255029 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: E0217 00:06:50.270580 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.274452 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.274618 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.274637 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.274662 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.274678 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: E0217 00:06:50.286155 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.289538 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.289569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.289578 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.289592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.289602 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: E0217 00:06:50.304211 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.307431 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.307480 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.307495 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.307514 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.307527 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: E0217 00:06:50.320219 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.323647 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.323688 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.323704 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.323719 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.323728 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: E0217 00:06:50.336599 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: E0217 00:06:50.336703 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.342087 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.342144 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.342156 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.342176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.342190 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.444790 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.444824 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.444833 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.444848 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.444856 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.547396 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.547442 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.547477 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.547494 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.547506 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.649660 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.649701 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.649712 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.649727 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.649739 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.739819 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/0.log" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.739862 4791 generic.go:334] "Generic (PLEG): container finished" podID="1104c109-74aa-4fc4-8a1b-914a0d5803a4" containerID="de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7" exitCode=1 Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.739887 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerDied","Data":"de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.740234 4791 scope.go:117] "RemoveContainer" containerID="de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.751547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.751721 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.751729 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.751741 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.751750 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.759969 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b
8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 
00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89
c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.779452 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.792178 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.804008 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.814137 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.822646 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.831842 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.851314 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.854015 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.854043 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.854053 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.854068 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.854081 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.862042 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.872759 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.903167 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/r
un/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.937766 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending 
*v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.948674 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.955623 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.955648 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.955656 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.955668 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.955676 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:50Z","lastTransitionTime":"2026-02-17T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.960183 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc 
kubenswrapper[4791]: I0217 00:06:50.970170 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:50 crc kubenswrapper[4791]: I0217 00:06:50.983278 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.000381 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.018915 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.058498 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.058544 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.058560 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.058583 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.058600 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.161597 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.161659 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.161681 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.161707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.161728 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.196493 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 00:55:10.920588291 +0000 UTC Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.219914 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.219991 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.220057 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.220084 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:51 crc kubenswrapper[4791]: E0217 00:06:51.220078 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:51 crc kubenswrapper[4791]: E0217 00:06:51.220220 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:51 crc kubenswrapper[4791]: E0217 00:06:51.220262 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:51 crc kubenswrapper[4791]: E0217 00:06:51.220306 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.263847 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.263930 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.263953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.263984 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.264008 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.365753 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.365788 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.365797 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.365810 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.365820 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.467556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.467592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.467604 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.467620 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.467632 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.569819 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.569851 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.569859 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.569874 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.569883 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.672030 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.672065 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.672076 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.672091 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.672102 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.744366 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/0.log" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.744417 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerStarted","Data":"6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.773579 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.773605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.773616 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.773630 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.773642 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.785199 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.802163 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.821341 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.833090 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.842379 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.859595 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.877004 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.878454 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.878473 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.878480 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.878494 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.878503 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.886435 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0
715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.895099 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc 
kubenswrapper[4791]: I0217 00:06:51.928295 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.943050 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40
ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.958580 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.973941 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.981263 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.981298 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.981309 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.981327 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.981338 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:51Z","lastTransitionTime":"2026-02-17T00:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:51 crc kubenswrapper[4791]: I0217 00:06:51.997215 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending 
*v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.009950 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:52Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.021876 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:52Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.033319 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:52Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.046281 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:52Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.083749 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.083778 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.083791 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.083808 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.083819 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.185710 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.185739 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.185752 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.185767 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.185778 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.197182 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 04:00:32.036674482 +0000 UTC Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.288189 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.288226 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.288237 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.288253 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.288266 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.390432 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.390486 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.390503 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.390524 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.390542 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.492768 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.492807 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.492819 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.492835 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.492846 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.594629 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.594661 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.594670 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.594684 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.594693 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.697031 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.697099 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.697234 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.697265 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.697288 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.799382 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.799429 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.799438 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.799451 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.799460 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.901892 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.901949 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.901970 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.901998 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:52 crc kubenswrapper[4791]: I0217 00:06:52.902018 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:52Z","lastTransitionTime":"2026-02-17T00:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.004657 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.004702 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.004717 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.004732 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.004742 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.109959 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.110022 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.110032 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.110052 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.110064 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.198194 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 17:57:49.76624103 +0000 UTC Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.211731 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.211763 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.211772 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.211787 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.211797 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.219362 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.219457 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.219510 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:53 crc kubenswrapper[4791]: E0217 00:06:53.219593 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.219767 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:53 crc kubenswrapper[4791]: E0217 00:06:53.219840 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:53 crc kubenswrapper[4791]: E0217 00:06:53.220014 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:53 crc kubenswrapper[4791]: E0217 00:06:53.220202 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.229821 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.239867 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a26064
04b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.254485 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.270092 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.295373 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.310746 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.314489 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.314549 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.314566 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.314597 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.314615 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.323542 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd9
06b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.340329 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d
06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b16d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.360752 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.375272 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.395379 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.412553 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.416812 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.416846 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.416861 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.416880 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.416896 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.432621 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending 
*v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.445647 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.459256 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc 
kubenswrapper[4791]: I0217 00:06:53.476499 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.494971 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.511262 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.520423 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.520478 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.520497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.520518 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.520533 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.524153 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8
a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.623366 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.623428 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.623451 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.623497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.623523 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.726134 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.726167 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.726176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.726189 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.726198 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.828664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.828707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.828716 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.828742 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.828752 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.931075 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.931167 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.931187 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.931211 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:53 crc kubenswrapper[4791]: I0217 00:06:53.931229 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:53Z","lastTransitionTime":"2026-02-17T00:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.033697 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.033745 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.033764 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.033788 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.033806 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.136045 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.136090 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.136112 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.136134 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.136178 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.227761 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 12:32:53.277012515 +0000 UTC Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.229018 4791 scope.go:117] "RemoveContainer" containerID="1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.238454 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.238498 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.238517 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.238538 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.238555 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.341514 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.341567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.341585 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.341611 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.341627 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.444127 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.444223 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.444247 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.444277 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.444300 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.546569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.546613 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.546625 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.546643 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.546654 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.648741 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.648813 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.648839 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.648872 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.648894 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.750763 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.750795 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.750804 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.750817 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.750825 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.754488 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/2.log" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.757325 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.757734 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.771829 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.781270 4791 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.790848 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.801832 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.811943 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.824167 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.838390 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.851273 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.852785 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.852837 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.852846 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.852862 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.852871 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.869297 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.883993 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.900993 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending 
*v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPat
h\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.910665 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/service
ca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.922252 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.942995 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d1
75b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.952329 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9e8d58-ba46-4194-9f8d-b06c8c126552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded890a1b2fba38ade24ab70206558ce4a87f423c6480bf56ef3e6836dc45e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.954738 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.954763 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.954771 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.954784 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.954794 4791 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:54Z","lastTransitionTime":"2026-02-17T00:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.963815 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.973985 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.983459 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:54 crc kubenswrapper[4791]: I0217 00:06:54.996300 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.056712 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.056746 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.056755 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.056768 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.056777 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.159670 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.159712 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.159722 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.159737 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.159745 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.219582 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.219655 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:55 crc kubenswrapper[4791]: E0217 00:06:55.219695 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:55 crc kubenswrapper[4791]: E0217 00:06:55.219803 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.219907 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:55 crc kubenswrapper[4791]: E0217 00:06:55.220075 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.220147 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:55 crc kubenswrapper[4791]: E0217 00:06:55.220294 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.228341 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 01:08:03.36761497 +0000 UTC Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.262934 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.262986 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.263002 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.263025 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.263043 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.366298 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.366384 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.366406 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.366435 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.366459 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.470552 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.470623 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.470642 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.470667 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.470684 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.573547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.573633 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.573657 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.573691 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.573713 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.676001 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.676071 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.676089 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.676153 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.676181 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.761810 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/3.log" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.762769 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/2.log" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.766733 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" exitCode=1 Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.766791 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.766850 4791 scope.go:117] "RemoveContainer" containerID="1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.767783 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:06:55 crc kubenswrapper[4791]: E0217 00:06:55.768014 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.778552 4791 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.778625 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.778639 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.778658 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.778670 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.783620 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc 
kubenswrapper[4791]: I0217 00:06:55.811626 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.824154 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40
ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.841934 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.859601 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.880479 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.880660 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.880765 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.880852 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.880926 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.880787 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a88821dd4511fed922e9ef54ac1a4e92ea05283988c9906656732bb27739fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:30Z\\\",\\\"message\\\":\\\"0] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 00:06:30.321001 6400 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321058 6400 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 00:06:30.321081 6400 handler.go:190] Sending 
*v1.Node event handler 2 for removal\\\\nI0217 00:06:30.321088 6400 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 00:06:30.321144 6400 factory.go:656] Stopping watch factory\\\\nI0217 00:06:30.321180 6400 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 00:06:30.321189 6400 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 00:06:30.321197 6400 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 00:06:30.321204 6400 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 00:06:30.321211 6400 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 00:06:30.321226 6400 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 00:06:30.321225 6400 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:30.321498 6400 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:55Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.182069 6791 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.182037 6791 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.184381 6791 reflector.go:311] Stopping reflector *v1.EndpointSlice 
(0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184540 6791 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184971 6791 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185048 6791 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185070 6791 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networ
ks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.897061 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.912711 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.929579 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9e8d58-ba46-4194-9f8d-b06c8c126552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded890a1b2fba38ade24ab70206558ce4a87f423c6480bf56ef3e6836dc45e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.949214 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.972307 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.983371 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.983423 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.983441 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.983465 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.983483 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:55Z","lastTransitionTime":"2026-02-17T00:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:55 crc kubenswrapper[4791]: I0217 00:06:55.992946 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:55Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.005396 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b16d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.023950 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.036096 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.045752 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.055958 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.069604 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.079065 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.086560 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.086599 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.086607 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc 
kubenswrapper[4791]: I0217 00:06:56.086622 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.086632 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.190307 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.190389 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.190414 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.190449 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.190476 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.229468 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 08:01:42.343208773 +0000 UTC Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.293039 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.293097 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.293146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.293171 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.293189 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.400667 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.400754 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.400777 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.400805 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.400827 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.503274 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.503312 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.503323 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.503338 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.503350 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.606147 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.606177 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.606190 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.606206 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.606218 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.708450 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.708497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.708513 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.708534 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.708551 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.770822 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/3.log" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.774974 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:06:56 crc kubenswrapper[4791]: E0217 00:06:56.775240 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.792565 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.808517 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.811520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.811552 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.811562 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.811578 
4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.811589 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.819670 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.835223 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.850358 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.862449 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.872155 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.882953 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.892028 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.902217 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.911598 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.913732 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.913775 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.913787 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.913803 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.913812 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:56Z","lastTransitionTime":"2026-02-17T00:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.927215 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:55Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.182069 6791 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.182037 6791 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.184381 6791 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184540 6791 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184971 6791 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185048 6791 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185070 6791 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.936404 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.945902 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc 
kubenswrapper[4791]: I0217 00:06:56.966743 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.979464 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9e8d58-ba46-4194-9f8d-b06c8c126552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded890a1b2fba38ade24ab70206558ce4a87f423c6480bf56ef3e6836dc45e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:56 crc kubenswrapper[4791]: I0217 00:06:56.993351 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:56Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.008146 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:57Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.015991 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.016017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.016029 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.016062 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.016073 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.018315 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8
a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:06:57Z is after 2025-08-24T17:21:41Z" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.118646 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.118682 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.118690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.118705 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.118714 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.220230 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.220365 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.220264 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.220241 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:57 crc kubenswrapper[4791]: E0217 00:06:57.220537 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:57 crc kubenswrapper[4791]: E0217 00:06:57.220685 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:57 crc kubenswrapper[4791]: E0217 00:06:57.220855 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:57 crc kubenswrapper[4791]: E0217 00:06:57.220977 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.222073 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.222098 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.222109 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.222120 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.222140 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.230461 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 13:33:55.453019621 +0000 UTC Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.325290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.325433 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.325460 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.325492 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.325517 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.428608 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.428669 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.428687 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.428710 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.428730 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.532661 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.532702 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.532715 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.532772 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.532785 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.635719 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.635791 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.635808 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.635832 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.635856 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.738995 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.739084 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.739108 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.739160 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.739178 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.842545 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.842615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.842638 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.842669 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.842690 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.946567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.946615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.946637 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.946665 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:57 crc kubenswrapper[4791]: I0217 00:06:57.946687 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:57Z","lastTransitionTime":"2026-02-17T00:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.049323 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.049367 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.049384 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.049405 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.049422 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.151337 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.151378 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.151394 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.151416 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.151433 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.230674 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 16:05:38.6996103 +0000 UTC Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.254301 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.254357 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.254374 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.254400 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.254418 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.357576 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.357907 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.358026 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.358164 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.358292 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.461509 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.461785 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.461927 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.462097 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.462291 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.566146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.566199 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.566218 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.566249 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.566274 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.669472 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.669540 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.669563 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.669593 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.669617 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.772738 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.772801 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.772820 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.772845 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.772862 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.875279 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.875351 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.875378 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.875409 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.875432 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.979020 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.979089 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.979113 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.979189 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:58 crc kubenswrapper[4791]: I0217 00:06:58.979209 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:58Z","lastTransitionTime":"2026-02-17T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.083099 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.083192 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.083210 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.083235 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.083254 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.185957 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.186026 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.186043 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.186068 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.186086 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.220002 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.220055 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.220055 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:06:59 crc kubenswrapper[4791]: E0217 00:06:59.220288 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.220311 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:06:59 crc kubenswrapper[4791]: E0217 00:06:59.220431 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:06:59 crc kubenswrapper[4791]: E0217 00:06:59.220573 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:06:59 crc kubenswrapper[4791]: E0217 00:06:59.220664 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.231621 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 13:50:16.989418273 +0000 UTC Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.289508 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.289563 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.289588 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.289617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.289636 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.393002 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.393063 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.393081 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.393110 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.393150 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.496282 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.496367 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.496391 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.496417 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.496439 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.599495 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.599541 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.599551 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.599567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.599578 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.701774 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.701839 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.701856 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.701882 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.701900 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.804200 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.804267 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.804288 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.804314 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.804332 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.907442 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.907509 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.907525 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.907548 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:06:59 crc kubenswrapper[4791]: I0217 00:06:59.907567 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:06:59Z","lastTransitionTime":"2026-02-17T00:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.010437 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.010493 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.010511 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.010535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.010557 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.113210 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.113276 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.113290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.113310 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.113324 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.215781 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.215847 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.215868 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.215895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.215916 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.232908 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 09:04:28.844530185 +0000 UTC Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.318480 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.318527 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.318543 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.318566 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.318585 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.418278 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.418353 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.418374 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.418398 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.418416 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: E0217 00:07:00.442983 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.457603 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.457670 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.457707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.457738 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.457760 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: E0217 00:07:00.474930 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.479783 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.479836 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.479854 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.479878 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.479900 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: E0217 00:07:00.494769 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.499859 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.499930 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.499950 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.499975 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.499997 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: E0217 00:07:00.518260 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.524165 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.524264 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.524290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.524321 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.524340 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: E0217 00:07:00.543198 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:00 crc kubenswrapper[4791]: E0217 00:07:00.543554 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.545627 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.545705 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.545724 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.545748 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.545766 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.649058 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.649149 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.649168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.649196 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.649216 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.753320 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.753386 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.753407 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.753431 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.753449 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.856046 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.856138 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.856157 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.856183 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.856200 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.959721 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.959777 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.959794 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.959818 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:00 crc kubenswrapper[4791]: I0217 00:07:00.959839 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:00Z","lastTransitionTime":"2026-02-17T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.063076 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.063204 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.063228 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.063255 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.063273 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.166511 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.166572 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.166592 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.166617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.166635 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.219792 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.219819 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.219956 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:01 crc kubenswrapper[4791]: E0217 00:07:01.220222 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:01 crc kubenswrapper[4791]: E0217 00:07:01.220388 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:01 crc kubenswrapper[4791]: E0217 00:07:01.220505 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.220863 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:01 crc kubenswrapper[4791]: E0217 00:07:01.220999 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.233326 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 16:47:28.362032751 +0000 UTC Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.272437 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.272512 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.272533 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.272568 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.272590 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.375944 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.376026 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.376061 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.376086 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.376111 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.479292 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.479351 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.479369 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.479392 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.479482 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.582049 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.582161 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.582180 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.582206 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.582224 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.685891 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.685968 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.685992 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.686024 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.686045 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.788329 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.788381 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.788399 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.788422 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.788439 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.891498 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.891556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.891574 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.891629 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.891649 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.994699 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.994755 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.994772 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.994798 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:01 crc kubenswrapper[4791]: I0217 00:07:01.994815 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:01Z","lastTransitionTime":"2026-02-17T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.097618 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.097664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.097677 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.097697 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.097714 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.201594 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.201634 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.201646 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.201663 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.201675 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.234339 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 22:06:16.321339476 +0000 UTC Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.304895 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.304984 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.305014 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.305055 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.305087 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.408762 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.408815 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.408829 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.408848 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.408863 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.511767 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.511830 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.511845 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.511863 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.512186 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.615463 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.615569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.615586 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.616024 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.616066 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.718870 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.718931 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.718949 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.718971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.718988 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.822239 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.822328 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.822344 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.822368 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.822386 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.925763 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.925841 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.925858 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.926349 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:02 crc kubenswrapper[4791]: I0217 00:07:02.926398 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:02Z","lastTransitionTime":"2026-02-17T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.029612 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.029675 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.029693 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.029718 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.029736 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.132651 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.132708 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.132725 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.132747 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.132766 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.219972 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.220091 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.220388 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:03 crc kubenswrapper[4791]: E0217 00:07:03.220371 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.220488 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:03 crc kubenswrapper[4791]: E0217 00:07:03.220703 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:03 crc kubenswrapper[4791]: E0217 00:07:03.220885 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:03 crc kubenswrapper[4791]: E0217 00:07:03.221041 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.235521 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.235582 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.235605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.235634 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.235656 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.236316 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 03:56:54.300464626 +0000 UTC Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.240374 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 
00:07:03.261142 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b16d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.284535 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.304097 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.323593 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.339388 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.339445 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.339497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.339530 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.339549 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.343624 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.366096 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.382030 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.398611 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc 
kubenswrapper[4791]: I0217 00:07:03.431462 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.442386 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.442427 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.442445 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.442468 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.442485 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.451460 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.471303 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.488684 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.512774 4791 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:55Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.182069 6791 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.182037 6791 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.184381 6791 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184540 6791 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184971 6791 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185048 6791 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185070 6791 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.531893 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.545702 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.545775 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.545794 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.545814 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.545825 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.549930 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9e8d58-ba46-4194-9f8d-b06c8c126552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded890a1b2fba38ade24ab70206558ce4a87f423c6480bf56ef3e6836dc45e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.571317 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.588009 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.606498 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.648364 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.648417 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.648436 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.648459 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.648478 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.751351 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.751412 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.751430 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.751495 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.751516 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.854167 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.854206 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.854217 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.854257 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.854270 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.957832 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.957931 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.957949 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.957974 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:03 crc kubenswrapper[4791]: I0217 00:07:03.957993 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:03Z","lastTransitionTime":"2026-02-17T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.060544 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.060607 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.060626 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.060655 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.060674 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.164642 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.164722 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.164745 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.164777 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.164805 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.237190 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 00:09:14.774794937 +0000 UTC Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.268039 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.268146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.268173 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.268205 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.268230 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.372289 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.372361 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.372379 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.372404 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.372423 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.476069 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.476179 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.476205 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.476233 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.476255 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.579360 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.579423 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.579440 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.579467 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.579488 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.683083 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.683173 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.683190 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.683215 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.683243 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.786818 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.786885 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.786908 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.786937 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.786958 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.890032 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.890095 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.890147 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.890177 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.890195 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.992667 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.992735 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.992754 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.992779 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:04 crc kubenswrapper[4791]: I0217 00:07:04.992796 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:04Z","lastTransitionTime":"2026-02-17T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.096197 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.096260 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.096278 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.096302 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.096320 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.199026 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.199072 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.199088 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.199103 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.199161 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.219661 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.219726 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.219741 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.219697 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:05 crc kubenswrapper[4791]: E0217 00:07:05.219871 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:05 crc kubenswrapper[4791]: E0217 00:07:05.219986 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:05 crc kubenswrapper[4791]: E0217 00:07:05.220227 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:05 crc kubenswrapper[4791]: E0217 00:07:05.220341 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.237842 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 18:58:01.226957729 +0000 UTC Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.301980 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.302038 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.302057 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.302081 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.302099 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.405904 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.405971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.405989 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.406018 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.406039 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.510207 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.510272 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.510293 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.510318 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.510337 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.613779 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.613846 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.613863 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.613893 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.613910 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.716596 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.716663 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.716687 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.716713 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.716731 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.819237 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.819301 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.819319 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.819343 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.819361 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.923422 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.923485 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.923502 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.923524 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:05 crc kubenswrapper[4791]: I0217 00:07:05.923541 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:05Z","lastTransitionTime":"2026-02-17T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.026534 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.026856 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.027092 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.027353 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.027571 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.130882 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.130937 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.130958 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.130987 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.131009 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.234406 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.234459 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.234470 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.234488 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.234499 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.238640 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 23:06:10.047242762 +0000 UTC Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.337091 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.337469 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.337634 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.337833 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.338052 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.441715 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.441791 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.441809 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.441836 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.441855 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.544695 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.545223 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.545393 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.545536 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.545679 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.649199 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.649303 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.649326 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.649356 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.649379 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.753893 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.753975 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.753998 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.754027 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.754049 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.856579 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.856662 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.856680 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.856704 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.856722 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.960172 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.960238 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.960259 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.960285 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:06 crc kubenswrapper[4791]: I0217 00:07:06.960310 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:06Z","lastTransitionTime":"2026-02-17T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.001904 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.002002 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 00:08:11.001979697 +0000 UTC m=+148.481492234 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.063155 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.063233 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.063252 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.063280 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.063300 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.102861 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.102926 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.102962 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.103017 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103258 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103282 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103335 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103363 4791 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103455 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 00:08:11.103424123 +0000 UTC m=+148.582936700 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103300 4791 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103571 4791 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103256 4791 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103256 4791 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103658 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 00:08:11.10363427 +0000 UTC m=+148.583146847 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103702 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:08:11.103677431 +0000 UTC m=+148.583189998 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.103727 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 00:08:11.103715133 +0000 UTC m=+148.583227700 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.166223 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.166298 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.166319 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.166344 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.166366 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.219291 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.219404 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.219308 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.219481 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.219403 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.219610 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.219700 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:07 crc kubenswrapper[4791]: E0217 00:07:07.219847 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.238858 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 11:40:53.327536662 +0000 UTC Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.269520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.269574 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.269590 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.269613 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.269629 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.374437 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.374841 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.374866 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.374889 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.374907 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.478392 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.478476 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.478496 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.478518 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.478617 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.581272 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.581330 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.581341 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.581359 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.581375 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.683991 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.684070 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.684137 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.684175 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.684197 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.787499 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.787562 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.787585 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.787614 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.787637 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.890265 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.890353 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.890428 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.890454 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.890472 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.993771 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.993842 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.993864 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.993894 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:07 crc kubenswrapper[4791]: I0217 00:07:07.993920 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:07Z","lastTransitionTime":"2026-02-17T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.097042 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.097096 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.097141 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.097165 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.097184 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.199960 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.200032 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.200069 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.200099 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.200165 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.238992 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 06:09:07.088831008 +0000 UTC Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.302955 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.303057 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.303080 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.303149 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.303174 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.406787 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.406864 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.406888 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.406919 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.406941 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.510190 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.510277 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.510305 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.510337 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.510360 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.614074 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.614162 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.614181 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.614203 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.614220 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.717909 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.717971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.717989 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.718009 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.718024 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.820737 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.820849 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.820927 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.821006 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.821039 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.924839 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.924917 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.924939 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.924970 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:08 crc kubenswrapper[4791]: I0217 00:07:08.924994 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:08Z","lastTransitionTime":"2026-02-17T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.028377 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.028432 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.028450 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.028475 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.028492 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.131715 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.131776 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.131799 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.131829 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.131853 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.219830 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.219959 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.220245 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.220289 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:09 crc kubenswrapper[4791]: E0217 00:07:09.220403 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:09 crc kubenswrapper[4791]: E0217 00:07:09.220645 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:09 crc kubenswrapper[4791]: E0217 00:07:09.220974 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:07:09 crc kubenswrapper[4791]: E0217 00:07:09.221146 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.234193 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.234245 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.234264 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.234290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.234313 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.239492 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 14:07:13.268386576 +0000 UTC
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.337834 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.337910 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.338017 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.338057 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.338079 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.440820 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.440899 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.440924 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.440953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.440972 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.544503 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.544568 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.544585 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.544610 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.544629 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.648191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.648266 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.648285 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.648311 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.648328 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.750391 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.750444 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.750456 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.750474 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.750485 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.853782 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.853858 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.853885 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.853916 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.853942 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.956752 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.956796 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.956810 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.956827 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:09 crc kubenswrapper[4791]: I0217 00:07:09.956838 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:09Z","lastTransitionTime":"2026-02-17T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.059664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.059727 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.059745 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.059768 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.059786 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.162568 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.162633 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.162661 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.162691 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.162711 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.239920 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 20:52:35.573900661 +0000 UTC
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.265520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.265593 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.265615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.265644 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.265666 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.368567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.368657 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.368675 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.368704 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.368730 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.471718 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.471832 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.471853 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.471883 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.471904 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.557871 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.557948 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.557972 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.557999 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.558020 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 00:07:10 crc kubenswrapper[4791]: E0217 00:07:10.573101 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.577570 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.577609 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.577617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.577632 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.577640 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: E0217 00:07:10.591637 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.596620 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.596690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.596701 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.596720 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.596733 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: E0217 00:07:10.610824 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.615246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.615336 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.615349 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.615370 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.615384 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: E0217 00:07:10.636906 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.641569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.641749 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.641878 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.642002 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.642151 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: E0217 00:07:10.656558 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:10Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:10 crc kubenswrapper[4791]: E0217 00:07:10.656778 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.658820 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.659000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.659256 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.659400 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.659518 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.762267 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.762337 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.762354 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.762379 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.762399 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.865560 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.865618 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.865635 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.865659 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.865678 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.969201 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.969268 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.969292 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.969322 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:10 crc kubenswrapper[4791]: I0217 00:07:10.969346 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:10Z","lastTransitionTime":"2026-02-17T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.072619 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.072949 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.073100 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.073334 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.073467 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.177068 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.177096 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.177124 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.177137 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.177146 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.220264 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.220407 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.220445 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:11 crc kubenswrapper[4791]: E0217 00:07:11.220610 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.220641 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:11 crc kubenswrapper[4791]: E0217 00:07:11.220787 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:11 crc kubenswrapper[4791]: E0217 00:07:11.220896 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:11 crc kubenswrapper[4791]: E0217 00:07:11.220985 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.240061 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 18:30:07.716223242 +0000 UTC Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.279905 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.280074 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.280099 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.280172 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.280188 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.383084 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.383179 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.383196 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.383214 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.383226 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.486064 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.486182 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.486209 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.486238 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.486261 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.591357 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.591429 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.591448 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.591477 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.591499 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.694564 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.694634 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.694657 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.694682 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.694704 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.798029 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.798151 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.798175 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.798206 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.798227 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.905232 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.905295 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.905313 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.905337 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:11 crc kubenswrapper[4791]: I0217 00:07:11.905354 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:11Z","lastTransitionTime":"2026-02-17T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.008917 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.008990 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.009009 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.009034 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.009052 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.112196 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.112249 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.112262 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.112280 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.112291 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.214603 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.214690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.214710 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.214742 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.214782 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.221355 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:07:12 crc kubenswrapper[4791]: E0217 00:07:12.221712 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.240232 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 03:40:36.970146783 +0000 UTC Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.318547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.318605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.318620 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.318638 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.318653 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.421260 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.421304 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.421312 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.421327 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.421339 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.523633 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.523699 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.523721 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.523750 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.523773 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.627391 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.627450 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.627469 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.627493 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.627513 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.731161 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.731237 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.731260 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.731291 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.731316 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.834695 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.834818 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.834887 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.834923 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.834948 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.938393 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.938459 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.938477 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.938503 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:12 crc kubenswrapper[4791]: I0217 00:07:12.938522 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:12Z","lastTransitionTime":"2026-02-17T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.042916 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.042977 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.042995 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.043019 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.043039 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.150954 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.151026 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.151047 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.151075 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.151094 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.219961 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.220031 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.220061 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.219980 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:13 crc kubenswrapper[4791]: E0217 00:07:13.220265 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:13 crc kubenswrapper[4791]: E0217 00:07:13.221486 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:13 crc kubenswrapper[4791]: E0217 00:07:13.221628 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:13 crc kubenswrapper[4791]: E0217 00:07:13.221364 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.241210 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 04:00:47.003695596 +0000 UTC Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.253641 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.253679 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.253691 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.253707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.253718 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.261032 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.278895 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.296001 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.310022 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.342437 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:55Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.182069 6791 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.182037 6791 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.184381 6791 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184540 6791 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184971 6791 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185048 6791 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185070 6791 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.355839 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.356194 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.356368 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.356508 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.356617 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.360875 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.376669 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc 
kubenswrapper[4791]: I0217 00:07:13.393271 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.405798 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9e8d58-ba46-4194-9f8d-b06c8c126552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded890a1b2fba38ade24ab70206558ce4a87f423c6480bf56ef3e6836dc45e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.427541 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.447634 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.460376 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.460426 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.460444 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.460470 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.460491 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.472151 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.493364 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.509661 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.528097 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.547684 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.565348 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.565403 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.565426 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.565454 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.565472 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.566711 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.583690 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd906b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638f
d398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.602229 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b1
6d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.669590 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.669664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.669682 4791 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.669707 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.669726 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.773002 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.773052 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.773069 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.773092 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.773193 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.875222 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.875641 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.875970 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.876298 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.876999 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.980049 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.980086 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.980129 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.980152 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:13 crc kubenswrapper[4791]: I0217 00:07:13.980167 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:13Z","lastTransitionTime":"2026-02-17T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.082917 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.083235 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.083382 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.083516 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.083671 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.186414 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.186484 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.186507 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.186539 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.186562 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.241484 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 03:31:02.991883163 +0000 UTC Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.322775 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.323208 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.324243 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.324342 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.324375 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.426940 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.427003 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.427025 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.427054 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.427078 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.530966 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.531341 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.531433 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.531545 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.532139 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.635639 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.635683 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.635694 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.635711 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.635722 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.738403 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.738465 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.738483 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.738507 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.738541 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.841880 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.841954 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.841975 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.842001 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.842023 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.945502 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.945571 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.945595 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.945625 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:14 crc kubenswrapper[4791]: I0217 00:07:14.945651 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:14Z","lastTransitionTime":"2026-02-17T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.048735 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.048793 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.048812 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.048835 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.048860 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.151683 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.151812 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.151831 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.151854 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.151872 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.220294 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:15 crc kubenswrapper[4791]: E0217 00:07:15.220522 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.220560 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.220576 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.220601 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:15 crc kubenswrapper[4791]: E0217 00:07:15.220729 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:15 crc kubenswrapper[4791]: E0217 00:07:15.220854 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:15 crc kubenswrapper[4791]: E0217 00:07:15.221268 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.242206 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 08:06:26.786521788 +0000 UTC Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.254187 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.254242 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.254253 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.254273 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.254286 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.357191 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.357247 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.357264 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.357284 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.357300 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.461407 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.461452 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.461463 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.461485 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.461499 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.564958 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.565027 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.565045 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.565084 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.565103 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.667797 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.667866 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.667889 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.667918 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.667940 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.770983 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.771054 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.771075 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.771100 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.771162 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.874563 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.874616 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.874634 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.874656 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.874673 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.978049 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.978133 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.978151 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.978177 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:15 crc kubenswrapper[4791]: I0217 00:07:15.978200 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:15Z","lastTransitionTime":"2026-02-17T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.081681 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.081715 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.081729 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.081746 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.081758 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.185490 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.185555 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.185573 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.185598 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.185617 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.242822 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 20:16:20.912475175 +0000 UTC Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.287693 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.287737 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.287753 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.287777 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.287793 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.391289 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.391391 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.391411 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.391435 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.391493 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.494172 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.494231 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.494248 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.494274 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.494285 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.596691 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.596754 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.596774 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.596799 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.596816 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.699295 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.699336 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.699344 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.699358 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.699367 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.802026 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.802074 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.802085 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.802100 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.802143 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.905178 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.905326 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.905349 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.905377 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:16 crc kubenswrapper[4791]: I0217 00:07:16.905400 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:16Z","lastTransitionTime":"2026-02-17T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.008476 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.008550 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.008569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.008596 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.008614 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.111886 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.111944 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.111961 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.111990 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.112040 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.214312 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.214388 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.214412 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.214442 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.214459 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.219714 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.219763 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.219725 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:17 crc kubenswrapper[4791]: E0217 00:07:17.219909 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:17 crc kubenswrapper[4791]: E0217 00:07:17.220147 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.220213 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:17 crc kubenswrapper[4791]: E0217 00:07:17.220479 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:17 crc kubenswrapper[4791]: E0217 00:07:17.220804 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.243081 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 06:51:41.86677935 +0000 UTC Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.317856 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.317920 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.317938 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.317962 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.317980 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.422911 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.422967 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.422984 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.423045 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.423104 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.526731 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.526794 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.526814 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.526838 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.526855 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.629351 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.629424 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.629441 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.629463 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.629481 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.732455 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.732523 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.732534 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.732550 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.732683 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.835159 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.835217 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.835235 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.835258 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.835274 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.937749 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.937815 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.937832 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.937857 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:17 crc kubenswrapper[4791]: I0217 00:07:17.937878 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:17Z","lastTransitionTime":"2026-02-17T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.040876 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.040938 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.040959 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.040985 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.041004 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.143513 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.143585 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.143609 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.143639 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.143660 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.243502 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 09:31:38.254267248 +0000 UTC Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.247146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.247200 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.247268 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.247294 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.247374 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.350542 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.350613 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.350633 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.350661 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.350681 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.454605 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.454685 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.454702 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.454729 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.454749 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.558388 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.558475 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.558510 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.558542 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.558578 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.661651 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.661718 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.661737 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.661760 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.661776 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.765144 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.765192 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.765231 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.765250 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.765262 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.867365 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.867458 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.867482 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.867504 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.867520 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.969476 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.969526 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.969543 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.969568 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:18 crc kubenswrapper[4791]: I0217 00:07:18.969584 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:18Z","lastTransitionTime":"2026-02-17T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.072314 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.072352 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.072370 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.072387 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.072397 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.175563 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.175612 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.175632 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.175661 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.175684 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.219992 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.220035 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.220069 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.220020 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:19 crc kubenswrapper[4791]: E0217 00:07:19.220219 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:19 crc kubenswrapper[4791]: E0217 00:07:19.220416 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:19 crc kubenswrapper[4791]: E0217 00:07:19.220554 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:19 crc kubenswrapper[4791]: E0217 00:07:19.220715 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.244003 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 23:58:18.298683194 +0000 UTC Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.277776 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.277873 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.277899 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.277928 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.277964 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.381460 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.381524 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.381549 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.381584 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.381609 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.484934 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.485015 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.485050 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.485085 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.485138 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.588536 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.588614 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.588625 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.588642 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.588655 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.691224 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.691290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.691316 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.691348 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.691371 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.793716 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.793776 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.793794 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.793817 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.793834 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.896864 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.896939 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.896951 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.896970 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:19 crc kubenswrapper[4791]: I0217 00:07:19.896983 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:19Z","lastTransitionTime":"2026-02-17T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.000472 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.000548 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.000572 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.000597 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.000616 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.104942 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.105015 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.105034 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.105058 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.105076 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.208094 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.208180 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.208198 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.208222 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.208241 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.245131 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 08:35:52.213486624 +0000 UTC Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.311556 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.311654 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.311674 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.312087 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.312173 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.420616 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.420677 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.420702 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.420731 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.420751 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.523222 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.523288 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.523308 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.523333 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.523352 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.625634 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.625904 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.626000 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.626080 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.626164 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.729577 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.729650 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.729669 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.729692 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.729709 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.833012 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.833180 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.833209 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.833238 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.833263 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.936665 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.936726 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.936749 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.936781 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.936803 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.971475 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.971535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.971552 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.971578 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.971597 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:20 crc kubenswrapper[4791]: E0217 00:07:20.993088 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.998724 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.999010 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.999248 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.999428 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:20 crc kubenswrapper[4791]: I0217 00:07:20.999572 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:20Z","lastTransitionTime":"2026-02-17T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.022452 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.028367 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.028445 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.028472 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.028502 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.028524 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.050645 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.056388 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.056438 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.056460 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.056488 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.056509 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.078499 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.083776 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.083841 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.083862 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.083886 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.083904 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.105328 4791 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T00:07:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24423ac-3d71-4c2f-893f-f52232a36e88\\\",\\\"systemUUID\\\":\\\"0568b345-f1c6-4fd9-8232-7bcd76fcbb73\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.105653 4791 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.107705 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.107769 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.107795 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.107824 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.107845 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.210467 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.210535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.210558 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.210589 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.210611 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.220063 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.220325 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.220391 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.220460 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.220407 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.222223 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.222483 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.222722 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.245596 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 07:01:50.025828653 +0000 UTC Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.313468 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.313520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.313538 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.313562 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.313580 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.415936 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.415995 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.416012 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.416036 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.416053 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.518741 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.518784 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.518795 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.518815 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.518827 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.622534 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.622595 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.622613 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.622638 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.622661 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.667744 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.668043 4791 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:07:21 crc kubenswrapper[4791]: E0217 00:07:21.668197 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs podName:1d97cf45-2324-494c-839f-6f264eba3828 nodeName:}" failed. No retries permitted until 2026-02-17 00:08:25.668161528 +0000 UTC m=+163.147674085 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs") pod "network-metrics-daemon-6x28n" (UID: "1d97cf45-2324-494c-839f-6f264eba3828") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.726178 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.726242 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.726266 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.726297 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.726321 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.829510 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.829590 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.829630 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.829664 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.829686 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.932791 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.932846 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.932864 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.932888 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:21 crc kubenswrapper[4791]: I0217 00:07:21.932905 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:21Z","lastTransitionTime":"2026-02-17T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.035728 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.035785 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.035803 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.035827 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.035844 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.138474 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.138533 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.138555 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.138587 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.138610 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.241453 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.241511 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.241526 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.241543 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.241557 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.246712 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 20:54:20.32516322 +0000 UTC Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.344609 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.344699 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.344725 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.344756 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.344781 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.448257 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.448318 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.448332 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.448393 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.448499 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.552468 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.552527 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.552567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.552595 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.552610 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.656650 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.656722 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.656734 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.656756 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.656771 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.760978 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.761051 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.761072 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.761104 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.761155 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.863741 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.863904 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.863925 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.863953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.863973 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.966798 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.966859 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.966876 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.966900 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:22 crc kubenswrapper[4791]: I0217 00:07:22.966920 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:22Z","lastTransitionTime":"2026-02-17T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.069401 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.069497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.069527 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.069561 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.069584 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.172980 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.173043 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.173059 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.173082 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.173101 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.220361 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.220445 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.220446 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:23 crc kubenswrapper[4791]: E0217 00:07:23.220624 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.220713 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:23 crc kubenswrapper[4791]: E0217 00:07:23.221022 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:23 crc kubenswrapper[4791]: E0217 00:07:23.221179 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:23 crc kubenswrapper[4791]: E0217 00:07:23.221287 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.247606 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 04:38:05.577008609 +0000 UTC Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.248290 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c4455fb-f818-4b8a-92dc-ac60431b5ff8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T00:06:02Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 00:05:56.906072 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 00:05:56.909262 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-775847740/tls.crt::/tmp/serving-cert-775847740/tls.key\\\\\\\"\\\\nI0217 00:06:02.287760 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 00:06:02.298834 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 00:06:02.298877 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 00:06:02.298925 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 00:06:02.298939 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 00:06:02.313286 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 00:06:02.313428 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 00:06:02.313440 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313582 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 00:06:02.313643 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 00:06:02.313692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 00:06:02.313736 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 00:06:02.313779 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 00:06:02.314827 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8eba69183344482a96c1e63e2453aa55
85287d7035b1bb71cc8cc61ac8d62cdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.269157 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea06e59c-dcd7-4a9b-a509-532df51d9d74\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610d0d54a0fe8d598bf362b3af4429fa8e4c434c7f22ba44817857f01fd0b10f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3893c0cf61ac0a3d79b959285e5a9d11cb5137bf23bc6f40d63913a1a506eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4aded7b68d65669bdf0c8cc0c7ffcf0cdc39933913c9acbe7eda9ad357a20c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.276567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.276652 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.276671 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.276701 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.276719 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.291705 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27fc2f27cc0e7a239baea43a45952284ad669508cd1e30a1abbd8be71cc56845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.341278 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.379831 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.380920 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.380961 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.380975 4791 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.380997 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.381013 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.394479 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3a228-86d6-4d54-ad63-0d36c9d59af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68b9ca2ddd9
06b806fbba6cf60f5da93e818789835bbebb64394c2e49dde5add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c6rgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9klkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.405983 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1832e521-1715-432d-917c-bc0ab725e92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6482797ad75c032ff3c59d7bc47afd9c215720de8bb76506421ebb686ee29800\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d
06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f54cf1746cd7a9cc213dadaf7dc256f06b16d9062fba3ddcfc389cb2ff5dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9h57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8tdlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.425222 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60622b56-3585-4c82-8162-8b903c1881d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daa50652bba2e559c1126202a0217ed21e68e47e8d042b130a62024463a76b1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63110da7b399d7d73373f58e7429947a56244077d278322412a8c92bb5c9dd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc3816fc7f4c359ad9693a04427933f9a93389c2e1eb1e81bb8c627ee19f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0033160715bc9b8ac3f661797d62cb9d175b60474464216081ea6736799adb2d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acc51b20aa4e2218b0da28057f006fae7be01962a5c40f7d4adaf7a545e7c3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4305cd3cc9b99a1d1e1c9fb462c2f6d2eb98b79d0e61eaa51461d1ab2ee568d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa26c00c3bf429be00c0163a7151f56790d059256d99c114be0d1ec93cc68c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c85ede530f80e9580356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85ede530f80e95
80356678c8df8448e21b13853f994bd3fc35536707e124ae3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.437327 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dl4gt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b819236-9682-4ef9-8653-516f45335793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68238de0dde122f0d40ee569447834db4e6ad1688b2dbaccfe2e181f21dea265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dl4gt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.452553 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-299s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1104c109-74aa-4fc4-8a1b-914a0d5803a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:50Z\\\",\\\"message\\\":\\\"2026-02-17T00:06:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b\\\\n2026-02-17T00:06:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ea86518-6146-4794-a40e-3e62aefbaa8b to /host/opt/cni/bin/\\\\n2026-02-17T00:06:05Z [verbose] multus-daemon started\\\\n2026-02-17T00:06:05Z [verbose] Readiness Indicator file check\\\\n2026-02-17T00:06:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-299s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.472548 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49cb15b5c5ae27cc9c262b7e82636c2f2d6f3e07796e624423d546977dcb504d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44ea7c26c47027d5ffb4fcd4c55d8e306da69b62b2d747a1662bb543a839a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.484402 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.484471 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.484497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.484535 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.484564 4791 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.502578 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T00:06:55Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.182069 6791 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.182037 6791 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 00:06:55.184381 6791 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184540 6791 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 00:06:55.184971 6791 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185048 6791 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 00:06:55.185070 6791 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880c5ffe17fb04d31b
5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r26vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hldzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.521896 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k5kxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f0a7811-6a89-456b-95ea-6c8e698479dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6374cd73fb97a5312a7ff1415d38c0ba0715d926d84e65e9661861204a844b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mnb2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k5kxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.539979 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6x28n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d97cf45-2324-494c-839f-6f264eba3828\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tnzq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6x28n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc 
kubenswrapper[4791]: I0217 00:07:23.558339 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307ada4d-5f23-4c2f-b915-110f48b354f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0271da50c7247334982fae6e28a6f872507b254be16d44d16caa4dd8803c55d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5862918ee8e19c933c26da9af8cabcfb3a381255459915f89d35f97b7dcb646\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://273489c73541499e670244ce9e3deb5f726a4b53b67e1d2ff4157c78a5890846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e521383a6bc1b3d1508d862e6d950187345c86d8d464472a626463244acd90c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.575080 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9e8d58-ba46-4194-9f8d-b06c8c126552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded890a1b2fba38ade24ab70206558ce4a87f423c6480bf56ef3e6836dc45e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9735d49b4f207d4da5f762dc10e55c7747f8af743a7095c32ef0c28331edc6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:05:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.588059 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.588157 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.588177 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.588202 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.588224 4791 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.597317 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18e827d36dd4eed6c91959a77077211612b83d6d8f2c960df7ddac773417de04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.617568 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.641331 4791 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8stwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eab5901c-ba92-4f20-9960-ac7cfd67b25a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T00:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb66ce24b01a2606404b80628d1811aa664a8a3bf8ca9e1247607c76df7d53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T00:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40bd80e7aceeff479c28b9f076841ef17e28c3456eae12b16d89d5fd85b8768c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c01edae6b82358749100a2924ad6c8b6657c718df384498a6ca1284ec1afe737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19ebc8f78d93de215fe8e08905dc7361715bfbd8402677ea5d0a30d5ab72a3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74fc0
63b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74fc063b3939f1d21ffbca3501f33029d7d7426e79a8793f98e796cbfc3dfd08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e82730fe4863dbc680d9d4dedb023bf52dfa7ca7154c1361554652f00b7fc90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a65771b660e8b81400dc7d784a8181dc0eee6f63a456c0a6964fe5e22d8db4b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T00:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T00:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nwqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T00:06:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8stwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.691713 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.691769 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.691788 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.691813 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.691833 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.794533 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.794585 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.794597 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.794619 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.794632 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.896730 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.896805 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.896831 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.896860 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:23 crc kubenswrapper[4791]: I0217 00:07:23.896879 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:23Z","lastTransitionTime":"2026-02-17T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.000451 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.000510 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.000528 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.000554 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.000573 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.104357 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.104440 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.104459 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.104533 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.104557 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.207994 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.208096 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.208183 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.208219 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.208240 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.248479 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 02:45:32.286809276 +0000 UTC Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.310929 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.310993 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.311010 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.311033 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.311050 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.414189 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.414283 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.414301 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.414327 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.414343 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.517166 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.517230 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.517253 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.517277 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.517294 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.620596 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.620651 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.620666 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.620690 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.620707 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.724283 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.724348 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.724366 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.724390 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.724407 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.827921 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.827959 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.827968 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.827982 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.827991 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.930729 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.930884 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.930911 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.930938 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:24 crc kubenswrapper[4791]: I0217 00:07:24.930956 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:24Z","lastTransitionTime":"2026-02-17T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.033980 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.034030 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.034046 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.034071 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.034085 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.137417 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.137500 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.137525 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.137560 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.137584 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.219991 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.220085 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:25 crc kubenswrapper[4791]: E0217 00:07:25.220191 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.220250 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:25 crc kubenswrapper[4791]: E0217 00:07:25.220439 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.220503 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:25 crc kubenswrapper[4791]: E0217 00:07:25.220556 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:25 crc kubenswrapper[4791]: E0217 00:07:25.220652 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.221710 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:07:25 crc kubenswrapper[4791]: E0217 00:07:25.221956 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hldzt_openshift-ovn-kubernetes(e7fe508f-1e8c-4da7-8f99-108e73cb3791)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.240672 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.240755 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.240778 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.240812 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.240833 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.248900 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 01:10:54.284363681 +0000 UTC Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.344062 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.344167 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.344187 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.344215 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.344238 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.446968 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.447057 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.447076 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.447100 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.447147 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.549851 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.549997 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.550021 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.550076 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.550099 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.653495 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.653555 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.653571 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.653598 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.653620 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.756364 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.756418 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.756437 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.756460 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.756479 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.860029 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.860180 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.860209 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.860238 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.860261 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.964010 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.964079 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.964134 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.964167 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:25 crc kubenswrapper[4791]: I0217 00:07:25.964188 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:25Z","lastTransitionTime":"2026-02-17T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.067036 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.067095 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.067146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.067174 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.067191 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.170517 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.170553 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.170562 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.170578 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.170587 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.249421 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 23:31:26.020843408 +0000 UTC Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.274263 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.274321 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.274338 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.274362 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.274381 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.376926 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.376992 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.377013 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.377038 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.377057 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.480873 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.481010 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.481081 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.481168 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.481197 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.584876 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.584929 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.584947 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.584971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.584989 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.688295 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.688354 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.688372 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.688396 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.688412 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.791437 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.791497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.791515 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.791542 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.791561 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.893761 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.893854 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.893874 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.893937 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.893959 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.997340 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.997401 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.997418 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.997441 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:26 crc kubenswrapper[4791]: I0217 00:07:26.997459 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:26Z","lastTransitionTime":"2026-02-17T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.100552 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.100614 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.100632 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.100656 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.100676 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.203736 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.203823 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.203840 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.203866 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.203884 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.219345 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.219417 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.219444 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:27 crc kubenswrapper[4791]: E0217 00:07:27.219499 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.219600 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:27 crc kubenswrapper[4791]: E0217 00:07:27.219731 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:27 crc kubenswrapper[4791]: E0217 00:07:27.220083 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:27 crc kubenswrapper[4791]: E0217 00:07:27.220600 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.250360 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 18:46:52.802816865 +0000 UTC Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.307037 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.307099 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.307146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.307171 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.307187 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.410027 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.410127 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.410150 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.410175 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.410193 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.513654 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.513721 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.513738 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.513769 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.513787 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.617134 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.617198 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.617220 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.617281 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.617306 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.720004 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.720057 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.720073 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.720102 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.720167 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.823588 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.823637 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.823655 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.823680 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.823700 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.926817 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.926857 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.926868 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.926884 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:27 crc kubenswrapper[4791]: I0217 00:07:27.926896 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:27Z","lastTransitionTime":"2026-02-17T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.029441 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.029510 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.029533 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.029567 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.029588 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.133892 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.133967 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.133991 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.134025 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.134050 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.237765 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.237826 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.237850 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.237877 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.237898 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.250506 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 01:40:06.441116385 +0000 UTC Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.340540 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.340589 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.340601 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.340618 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.340631 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.443341 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.443388 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.443400 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.443420 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.443432 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.546835 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.546897 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.546916 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.546948 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.546971 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.650226 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.650315 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.650341 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.650379 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.650400 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.753411 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.753474 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.753495 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.753520 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.753537 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.856420 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.856487 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.856505 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.856528 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.856546 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.960337 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.960421 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.960440 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.960469 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:28 crc kubenswrapper[4791]: I0217 00:07:28.960487 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:28Z","lastTransitionTime":"2026-02-17T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.062885 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.062949 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.062967 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.062994 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.063018 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.165719 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.165788 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.165808 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.165834 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.165852 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.220298 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.220442 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.220348 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:29 crc kubenswrapper[4791]: E0217 00:07:29.220539 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.220375 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:29 crc kubenswrapper[4791]: E0217 00:07:29.220960 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:29 crc kubenswrapper[4791]: E0217 00:07:29.220998 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:29 crc kubenswrapper[4791]: E0217 00:07:29.221041 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.250654 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 02:50:44.719873384 +0000 UTC Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.268897 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.268970 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.268993 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.269027 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.269052 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.371948 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.372025 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.372049 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.372082 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.372363 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.476703 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.476788 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.476817 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.476854 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.476879 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.579484 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.579569 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.579617 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.579651 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.579676 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.682366 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.682433 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.682457 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.682487 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.682512 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.785653 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.785698 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.785706 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.785720 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.785729 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.888615 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.888685 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.888706 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.888730 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.888746 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.994600 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.994694 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.994713 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.994736 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:29 crc kubenswrapper[4791]: I0217 00:07:29.994789 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:29Z","lastTransitionTime":"2026-02-17T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.097205 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.097265 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.097283 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.097306 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.097326 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.199806 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.199907 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.199934 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.199964 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.199986 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.251784 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 20:19:28.222833851 +0000 UTC Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.302953 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.302988 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.303016 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.303033 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.303042 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.406403 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.406480 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.406498 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.406536 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.406554 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.509176 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.509246 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.509264 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.509290 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.509307 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.612619 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.612986 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.613268 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.613443 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.613580 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.717165 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.717269 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.717288 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.717310 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.717329 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.820576 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.820971 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.821146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.821323 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.821470 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.924332 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.924388 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.924405 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.924429 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:30 crc kubenswrapper[4791]: I0217 00:07:30.924447 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:30Z","lastTransitionTime":"2026-02-17T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.027362 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.027429 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.027446 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.027471 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.027488 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:31Z","lastTransitionTime":"2026-02-17T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.131488 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.131547 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.131564 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.131588 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.131605 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:31Z","lastTransitionTime":"2026-02-17T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.220011 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.220194 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:31 crc kubenswrapper[4791]: E0217 00:07:31.220330 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.220442 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.220527 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:31 crc kubenswrapper[4791]: E0217 00:07:31.220889 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:31 crc kubenswrapper[4791]: E0217 00:07:31.221886 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:31 crc kubenswrapper[4791]: E0217 00:07:31.222030 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.235090 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.235233 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.235261 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.235293 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.235317 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:31Z","lastTransitionTime":"2026-02-17T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.252892 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 22:43:11.010150122 +0000 UTC Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.338439 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.338497 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.338515 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.338541 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.338560 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:31Z","lastTransitionTime":"2026-02-17T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.344040 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.344103 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.344146 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.344173 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.344192 4791 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T00:07:31Z","lastTransitionTime":"2026-02-17T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.409602 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2"] Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.410466 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.412834 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.413020 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.413187 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.413844 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.479049 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podStartSLOduration=89.479031381 podStartE2EDuration="1m29.479031381s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.478930968 +0000 UTC m=+108.958443505" watchObservedRunningTime="2026-02-17 00:07:31.479031381 +0000 UTC m=+108.958543908" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.482889 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.482942 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.483002 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.483041 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.483085 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.495449 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8tdlq" podStartSLOduration=88.495434056 podStartE2EDuration="1m28.495434056s" 
podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.495234549 +0000 UTC m=+108.974747076" watchObservedRunningTime="2026-02-17 00:07:31.495434056 +0000 UTC m=+108.974946583" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.510591 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.51055071 podStartE2EDuration="1m29.51055071s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.510432736 +0000 UTC m=+108.989945253" watchObservedRunningTime="2026-02-17 00:07:31.51055071 +0000 UTC m=+108.990063247" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.526079 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=85.526059165 podStartE2EDuration="1m25.526059165s" podCreationTimestamp="2026-02-17 00:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.526035875 +0000 UTC m=+109.005548412" watchObservedRunningTime="2026-02-17 00:07:31.526059165 +0000 UTC m=+109.005571692" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.542058 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-299s7" podStartSLOduration=89.542035606 podStartE2EDuration="1m29.542035606s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.541910772 +0000 UTC m=+109.021423309" 
watchObservedRunningTime="2026-02-17 00:07:31.542035606 +0000 UTC m=+109.021548143" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.583784 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.583887 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.583919 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.583969 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.584004 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.584045 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.584080 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.585164 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.591909 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 
00:07:31.605385 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jmvl2\" (UID: \"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.628359 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-k5kxc" podStartSLOduration=89.628340629 podStartE2EDuration="1m29.628340629s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.608569706 +0000 UTC m=+109.088082273" watchObservedRunningTime="2026-02-17 00:07:31.628340629 +0000 UTC m=+109.107853156" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.658417 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=88.65839696 podStartE2EDuration="1m28.65839696s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.657824681 +0000 UTC m=+109.137337218" watchObservedRunningTime="2026-02-17 00:07:31.65839696 +0000 UTC m=+109.137909487" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.667953 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dl4gt" podStartSLOduration=89.667931224 podStartE2EDuration="1m29.667931224s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.667670817 +0000 UTC m=+109.147183334" 
watchObservedRunningTime="2026-02-17 00:07:31.667931224 +0000 UTC m=+109.147443751" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.728611 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=58.728591026 podStartE2EDuration="58.728591026s" podCreationTimestamp="2026-02-17 00:06:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.711919663 +0000 UTC m=+109.191432220" watchObservedRunningTime="2026-02-17 00:07:31.728591026 +0000 UTC m=+109.208103563" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.729036 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=38.729029 podStartE2EDuration="38.729029s" podCreationTimestamp="2026-02-17 00:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.728543414 +0000 UTC m=+109.208055951" watchObservedRunningTime="2026-02-17 00:07:31.729029 +0000 UTC m=+109.208541537" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.730132 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.746069 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8stwf" podStartSLOduration=89.746051325 podStartE2EDuration="1m29.746051325s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.745857418 +0000 UTC m=+109.225369965" watchObservedRunningTime="2026-02-17 00:07:31.746051325 +0000 UTC m=+109.225563852" Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.915798 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" event={"ID":"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1","Type":"ContainerStarted","Data":"e54a80eac0abbabaa9da307ea9535e11d2e9952af7df45436c5550d2f1b5c91e"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.916167 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" event={"ID":"4a4fffed-0ee5-4d73-8bd9-89fdb9333eb1","Type":"ContainerStarted","Data":"0eda90be92faad7285114516a7f3ceebb90d582221526bb0138bb395d888ec3b"} Feb 17 00:07:31 crc kubenswrapper[4791]: I0217 00:07:31.946458 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jmvl2" podStartSLOduration=89.946434226 podStartE2EDuration="1m29.946434226s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:31.946330222 +0000 UTC m=+109.425842799" watchObservedRunningTime="2026-02-17 00:07:31.946434226 +0000 UTC m=+109.425946763" Feb 17 00:07:32 crc 
kubenswrapper[4791]: I0217 00:07:32.253639 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 06:58:12.708548406 +0000 UTC Feb 17 00:07:32 crc kubenswrapper[4791]: I0217 00:07:32.254698 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 17 00:07:32 crc kubenswrapper[4791]: I0217 00:07:32.266744 4791 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 00:07:33 crc kubenswrapper[4791]: I0217 00:07:33.219757 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:33 crc kubenswrapper[4791]: I0217 00:07:33.220370 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:33 crc kubenswrapper[4791]: E0217 00:07:33.222139 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:33 crc kubenswrapper[4791]: I0217 00:07:33.222166 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:33 crc kubenswrapper[4791]: I0217 00:07:33.222208 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:33 crc kubenswrapper[4791]: E0217 00:07:33.222229 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:33 crc kubenswrapper[4791]: E0217 00:07:33.222321 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:33 crc kubenswrapper[4791]: E0217 00:07:33.222475 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:35 crc kubenswrapper[4791]: I0217 00:07:35.219578 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:35 crc kubenswrapper[4791]: I0217 00:07:35.219781 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:35 crc kubenswrapper[4791]: E0217 00:07:35.220612 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:35 crc kubenswrapper[4791]: I0217 00:07:35.220654 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:35 crc kubenswrapper[4791]: I0217 00:07:35.220744 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:35 crc kubenswrapper[4791]: E0217 00:07:35.220882 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:35 crc kubenswrapper[4791]: E0217 00:07:35.221020 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:35 crc kubenswrapper[4791]: E0217 00:07:35.221306 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:36 crc kubenswrapper[4791]: I0217 00:07:36.935599 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/1.log" Feb 17 00:07:36 crc kubenswrapper[4791]: I0217 00:07:36.936618 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/0.log" Feb 17 00:07:36 crc kubenswrapper[4791]: I0217 00:07:36.936672 4791 generic.go:334] "Generic (PLEG): container finished" podID="1104c109-74aa-4fc4-8a1b-914a0d5803a4" containerID="6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c" exitCode=1 Feb 17 00:07:36 crc kubenswrapper[4791]: I0217 00:07:36.936704 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerDied","Data":"6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c"} Feb 17 00:07:36 crc kubenswrapper[4791]: I0217 00:07:36.936737 4791 scope.go:117] "RemoveContainer" containerID="de79d4bc0901dbad74a20fd419c4726532328347effd404dad5da024782cdeb7" Feb 17 00:07:36 crc kubenswrapper[4791]: I0217 00:07:36.937064 4791 scope.go:117] "RemoveContainer" containerID="6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c" Feb 17 00:07:36 crc kubenswrapper[4791]: E0217 
00:07:36.937257 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-299s7_openshift-multus(1104c109-74aa-4fc4-8a1b-914a0d5803a4)\"" pod="openshift-multus/multus-299s7" podUID="1104c109-74aa-4fc4-8a1b-914a0d5803a4" Feb 17 00:07:37 crc kubenswrapper[4791]: I0217 00:07:37.219702 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:37 crc kubenswrapper[4791]: I0217 00:07:37.219852 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:37 crc kubenswrapper[4791]: I0217 00:07:37.219974 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:37 crc kubenswrapper[4791]: I0217 00:07:37.220068 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:37 crc kubenswrapper[4791]: E0217 00:07:37.220390 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:37 crc kubenswrapper[4791]: E0217 00:07:37.220520 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:37 crc kubenswrapper[4791]: E0217 00:07:37.220681 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:37 crc kubenswrapper[4791]: E0217 00:07:37.221366 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 00:07:37 crc kubenswrapper[4791]: I0217 00:07:37.942100 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/1.log" Feb 17 00:07:38 crc kubenswrapper[4791]: I0217 00:07:38.221460 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:07:38 crc kubenswrapper[4791]: I0217 00:07:38.946298 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/3.log" Feb 17 00:07:38 crc kubenswrapper[4791]: I0217 00:07:38.949254 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" 
event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerStarted","Data":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} Feb 17 00:07:38 crc kubenswrapper[4791]: I0217 00:07:38.949661 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:07:38 crc kubenswrapper[4791]: I0217 00:07:38.991171 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podStartSLOduration=96.991157285 podStartE2EDuration="1m36.991157285s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:07:38.989324066 +0000 UTC m=+116.468836593" watchObservedRunningTime="2026-02-17 00:07:38.991157285 +0000 UTC m=+116.470669812" Feb 17 00:07:39 crc kubenswrapper[4791]: I0217 00:07:39.063506 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6x28n"] Feb 17 00:07:39 crc kubenswrapper[4791]: I0217 00:07:39.063630 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:07:39 crc kubenswrapper[4791]: E0217 00:07:39.063730 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828" Feb 17 00:07:39 crc kubenswrapper[4791]: I0217 00:07:39.220274 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:07:39 crc kubenswrapper[4791]: I0217 00:07:39.220307 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:07:39 crc kubenswrapper[4791]: I0217 00:07:39.220365 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:07:39 crc kubenswrapper[4791]: E0217 00:07:39.220474 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 00:07:39 crc kubenswrapper[4791]: E0217 00:07:39.220592 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 00:07:39 crc kubenswrapper[4791]: E0217 00:07:39.220785 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:41 crc kubenswrapper[4791]: I0217 00:07:41.220273 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:41 crc kubenswrapper[4791]: I0217 00:07:41.220358 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:41 crc kubenswrapper[4791]: I0217 00:07:41.220547 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:41 crc kubenswrapper[4791]: I0217 00:07:41.220274 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:41 crc kubenswrapper[4791]: E0217 00:07:41.220532 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:41 crc kubenswrapper[4791]: E0217 00:07:41.220686 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:41 crc kubenswrapper[4791]: E0217 00:07:41.220798 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:07:41 crc kubenswrapper[4791]: E0217 00:07:41.220924 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:07:43 crc kubenswrapper[4791]: E0217 00:07:43.211540 4791 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Feb 17 00:07:43 crc kubenswrapper[4791]: I0217 00:07:43.219872 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:43 crc kubenswrapper[4791]: I0217 00:07:43.219912 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:43 crc kubenswrapper[4791]: I0217 00:07:43.220006 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:43 crc kubenswrapper[4791]: E0217 00:07:43.222053 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:43 crc kubenswrapper[4791]: I0217 00:07:43.222170 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:43 crc kubenswrapper[4791]: E0217 00:07:43.222352 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:43 crc kubenswrapper[4791]: E0217 00:07:43.222441 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:07:43 crc kubenswrapper[4791]: E0217 00:07:43.222596 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:07:43 crc kubenswrapper[4791]: E0217 00:07:43.343415 4791 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 00:07:45 crc kubenswrapper[4791]: I0217 00:07:45.220401 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:45 crc kubenswrapper[4791]: I0217 00:07:45.220495 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:45 crc kubenswrapper[4791]: I0217 00:07:45.220498 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:45 crc kubenswrapper[4791]: I0217 00:07:45.220600 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:45 crc kubenswrapper[4791]: E0217 00:07:45.220980 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:07:45 crc kubenswrapper[4791]: E0217 00:07:45.221288 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:45 crc kubenswrapper[4791]: E0217 00:07:45.221409 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:07:45 crc kubenswrapper[4791]: E0217 00:07:45.221523 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:47 crc kubenswrapper[4791]: I0217 00:07:47.219846 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:47 crc kubenswrapper[4791]: I0217 00:07:47.220021 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:47 crc kubenswrapper[4791]: E0217 00:07:47.220070 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:07:47 crc kubenswrapper[4791]: I0217 00:07:47.220176 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:47 crc kubenswrapper[4791]: I0217 00:07:47.220190 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:47 crc kubenswrapper[4791]: E0217 00:07:47.220343 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:47 crc kubenswrapper[4791]: E0217 00:07:47.220532 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:47 crc kubenswrapper[4791]: E0217 00:07:47.220663 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:07:48 crc kubenswrapper[4791]: E0217 00:07:48.344956 4791 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 00:07:49 crc kubenswrapper[4791]: I0217 00:07:49.220020 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:49 crc kubenswrapper[4791]: I0217 00:07:49.220174 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:49 crc kubenswrapper[4791]: I0217 00:07:49.220230 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:49 crc kubenswrapper[4791]: E0217 00:07:49.220268 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:07:49 crc kubenswrapper[4791]: I0217 00:07:49.220193 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:49 crc kubenswrapper[4791]: E0217 00:07:49.220516 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:07:49 crc kubenswrapper[4791]: E0217 00:07:49.220587 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:49 crc kubenswrapper[4791]: E0217 00:07:49.220815 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:50 crc kubenswrapper[4791]: I0217 00:07:50.219858 4791 scope.go:117] "RemoveContainer" containerID="6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c"
Feb 17 00:07:51 crc kubenswrapper[4791]: I0217 00:07:51.000398 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/1.log"
Feb 17 00:07:51 crc kubenswrapper[4791]: I0217 00:07:51.000795 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerStarted","Data":"583de3e38aa471160a347992af3add013ed63c250a69e768895a5ca5ef559bea"}
Feb 17 00:07:51 crc kubenswrapper[4791]: I0217 00:07:51.219886 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:51 crc kubenswrapper[4791]: I0217 00:07:51.219964 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:51 crc kubenswrapper[4791]: E0217 00:07:51.220062 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:07:51 crc kubenswrapper[4791]: I0217 00:07:51.220122 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:51 crc kubenswrapper[4791]: I0217 00:07:51.220142 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:51 crc kubenswrapper[4791]: E0217 00:07:51.220299 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:51 crc kubenswrapper[4791]: E0217 00:07:51.220479 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:51 crc kubenswrapper[4791]: E0217 00:07:51.220616 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:07:53 crc kubenswrapper[4791]: I0217 00:07:53.220177 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:53 crc kubenswrapper[4791]: I0217 00:07:53.220277 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:53 crc kubenswrapper[4791]: E0217 00:07:53.222281 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 00:07:53 crc kubenswrapper[4791]: I0217 00:07:53.222312 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:53 crc kubenswrapper[4791]: I0217 00:07:53.222467 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:53 crc kubenswrapper[4791]: E0217 00:07:53.222644 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 00:07:53 crc kubenswrapper[4791]: E0217 00:07:53.222858 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 00:07:53 crc kubenswrapper[4791]: E0217 00:07:53.223015 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6x28n" podUID="1d97cf45-2324-494c-839f-6f264eba3828"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.219816 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.219910 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.219833 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.220277 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.223772 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.223930 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.223967 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.223977 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.224168 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 17 00:07:55 crc kubenswrapper[4791]: I0217 00:07:55.224562 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.717059 4791 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.773804 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jftdn"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.774420 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.779831 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.780099 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.780501 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.780548 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.781388 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.781938 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.783521 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.784127 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.784699 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.790626 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.791027 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.791431 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.791698 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.791950 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.792179 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.793463 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-rt865"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.793909 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.793993 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29521440-k6f7k"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.801836 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-rt865"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.809684 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-flvjk"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.811000 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29521440-k6f7k"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.812331 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-49n75"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.813274 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.814034 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.825973 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.826883 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.827618 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.828427 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.828635 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.829061 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.829797 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.829904 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.829933 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.830040 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.830063 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.830304 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.830470 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.830562 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.830671 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.830810 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.831166 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.832270 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8zpf"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.832907 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.832990 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.834240 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.837717 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5nwz7"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.838406 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.838775 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.839112 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.839218 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.841104 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-t5827"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.841657 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-t5827"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.843028 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-frmbv"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.843613 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-frmbv"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.845435 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-stqb9"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.846102 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.849645 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.849870 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.849938 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850035 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850052 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850095 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850177 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850210 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850283 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850336 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850395 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850426 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850514 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850560 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850610 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850673 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850700 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850766 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850783 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850850 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850864 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850943 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.850960 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.851101 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.851228 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.851271 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.851232 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.851392 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.851629 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.853347 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jftdn"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.854283 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.856231 4791 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.856387 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.856612 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.857826 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wfpf2"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.859195 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.860330 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.870040 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.870431 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.871053 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.871177 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.871364 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.873508 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.874591 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.874861 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.875116 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.875393 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.875692 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.875997 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.876619 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.876956 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.877921 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 17 
00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.878707 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.879090 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.879269 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.881918 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.882049 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.898709 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.898994 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.899065 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.899199 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.899269 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.899359 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 
00:08:01.899451 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.899585 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.899641 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.900116 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.900217 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.900273 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.900486 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.900886 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.901191 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.901205 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.901400 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.902249 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.903495 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.903644 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.903749 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.903763 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.903875 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.904055 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.904473 4791 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"trusted-ca-bundle" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.904581 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.909565 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.910078 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.910391 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.910946 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.911102 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.911206 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.911414 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.912343 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.912744 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.914330 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.915251 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.915676 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7pqm8"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.916306 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.924337 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.924753 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.925824 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.926009 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.926464 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.926787 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.927032 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.927088 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"] Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.927458 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935102 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-config\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935160 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d469ce1-e7ed-4826-a378-0de16f2b4e56-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935186 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-oauth-serving-cert\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 
00:08:01.935231 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/839b6744-bbe6-4b56-b020-181d86c604fe-auth-proxy-config\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935256 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-audit\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935272 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935291 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c752f56-7754-4718-aea5-cb41d6ac4253-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935310 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-serving-cert\") pod \"apiserver-7bbb656c7d-sgzjl\" 
(UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935340 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9pgd\" (UniqueName: \"kubernetes.io/projected/3d469ce1-e7ed-4826-a378-0de16f2b4e56-kube-api-access-m9pgd\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935359 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935378 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935397 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-etcd-serving-ca\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935413 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5526b957-e33f-4952-8dda-d2875c94686a-etcd-client\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935435 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnsxm\" (UniqueName: \"kubernetes.io/projected/b7f0fa93-740f-43aa-9350-24d9920a9345-kube-api-access-gnsxm\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935455 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c752f56-7754-4718-aea5-cb41d6ac4253-config\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935473 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9c752f56-7754-4718-aea5-cb41d6ac4253-images\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935503 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-etcd-client\") pod \"apiserver-76f77b778f-flvjk\" (UID: 
\"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935521 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/155619c1-12ba-4149-9dce-474e3735168c-console-serving-cert\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935536 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-etcd-ca\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935558 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f0fa93-740f-43aa-9350-24d9920a9345-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935578 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935599 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cgt9z\" (UniqueName: \"kubernetes.io/projected/94401a93-55c7-4e8b-83f7-dc27a876f335-kube-api-access-cgt9z\") pod \"image-pruner-29521440-k6f7k\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935622 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/155619c1-12ba-4149-9dce-474e3735168c-console-oauth-config\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935640 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/839b6744-bbe6-4b56-b020-181d86c604fe-config\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935664 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-policies\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935686 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-service-ca\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 
00:08:01.935704 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-dir\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935724 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8kss\" (UniqueName: \"kubernetes.io/projected/643578b4-75ca-4765-8df5-9167688e3ced-kube-api-access-x8kss\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935744 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt7pk\" (UniqueName: \"kubernetes.io/projected/155619c1-12ba-4149-9dce-474e3735168c-kube-api-access-gt7pk\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935764 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d469ce1-e7ed-4826-a378-0de16f2b4e56-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935783 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935802 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-client-ca\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935824 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-image-import-ca\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935851 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-encryption-config\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935879 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-config\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935905 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935921 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935943 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-service-ca-bundle\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935971 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-trusted-ca-bundle\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.935997 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/839b6744-bbe6-4b56-b020-181d86c604fe-machine-approver-tls\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936016 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8zlc\" (UniqueName: \"kubernetes.io/projected/713c3460-f77d-4f7b-81bf-911f8f875dfe-kube-api-access-k8zlc\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936035 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/94401a93-55c7-4e8b-83f7-dc27a876f335-serviceca\") pod \"image-pruner-29521440-k6f7k\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " pod="openshift-image-registry/image-pruner-29521440-k6f7k"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936061 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936092 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-config\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936117 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/713c3460-f77d-4f7b-81bf-911f8f875dfe-serving-cert\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936157 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936194 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936233 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936298 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsnph\" (UniqueName: \"kubernetes.io/projected/0522c983-dae6-41ca-807a-ff45912a0024-kube-api-access-fsnph\") pod \"downloads-7954f5f757-rt865\" (UID: \"0522c983-dae6-41ca-807a-ff45912a0024\") " pod="openshift-console/downloads-7954f5f757-rt865"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936847 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws66n\" (UniqueName: \"kubernetes.io/projected/c70fe9d3-348d-4bb8-89f7-21027041131a-kube-api-access-ws66n\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936895 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.936921 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49kkg\" (UniqueName: \"kubernetes.io/projected/9c752f56-7754-4718-aea5-cb41d6ac4253-kube-api-access-49kkg\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.937581 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5526b957-e33f-4952-8dda-d2875c94686a-serving-cert\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.937691 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-config\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.937752 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-etcd-client\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.937787 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-encryption-config\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.937852 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.937976 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4360bf41-9e45-498e-8f94-2c43a0dc88e5-config\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938008 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pgvc\" (UniqueName: \"kubernetes.io/projected/4360bf41-9e45-498e-8f94-2c43a0dc88e5-kube-api-access-8pgvc\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938136 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f00b345-a265-41cc-89b7-6f059fc4d5d1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qzgkr\" (UID: \"5f00b345-a265-41cc-89b7-6f059fc4d5d1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938208 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad62ba3d-c60a-4e1f-9768-187e74151f24-serving-cert\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938239 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7f0fa93-740f-43aa-9350-24d9920a9345-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938294 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b1913d4-85d3-4596-acea-6e272cf81e8e-audit-dir\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938440 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbqwz\" (UniqueName: \"kubernetes.io/projected/1b1913d4-85d3-4596-acea-6e272cf81e8e-kube-api-access-cbqwz\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938440 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938484 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-console-config\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938541 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-audit-policies\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938598 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/643578b4-75ca-4765-8df5-9167688e3ced-audit-dir\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938659 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4ssg\" (UniqueName: \"kubernetes.io/projected/5f00b345-a265-41cc-89b7-6f059fc4d5d1-kube-api-access-f4ssg\") pod \"cluster-samples-operator-665b6dd947-qzgkr\" (UID: \"5f00b345-a265-41cc-89b7-6f059fc4d5d1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938696 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp6bb\" (UniqueName: \"kubernetes.io/projected/839b6744-bbe6-4b56-b020-181d86c604fe-kube-api-access-pp6bb\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938737 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1b1913d4-85d3-4596-acea-6e272cf81e8e-node-pullsecrets\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938777 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.938853 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4360bf41-9e45-498e-8f94-2c43a0dc88e5-trusted-ca\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939024 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4360bf41-9e45-498e-8f94-2c43a0dc88e5-serving-cert\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939191 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-serving-cert\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939256 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939341 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-config\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939367 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdjvm\" (UniqueName: \"kubernetes.io/projected/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-kube-api-access-pdjvm\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939464 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939474 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-serving-cert\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939515 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939708 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw65g\" (UniqueName: \"kubernetes.io/projected/ad62ba3d-c60a-4e1f-9768-187e74151f24-kube-api-access-vw65g\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939772 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-serving-cert\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939832 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9z7g\" (UniqueName: \"kubernetes.io/projected/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-kube-api-access-x9z7g\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939874 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4kmn\" (UniqueName: \"kubernetes.io/projected/5526b957-e33f-4952-8dda-d2875c94686a-kube-api-access-w4kmn\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939912 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f0fa93-740f-43aa-9350-24d9920a9345-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.939935 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-etcd-service-ca\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.940026 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ad62ba3d-c60a-4e1f-9768-187e74151f24-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.940064 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-client-ca\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.940335 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.942872 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rt865"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.942926 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5jsj6"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.945249 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.945276 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.945620 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.947341 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.949039 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.952643 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-72t6m"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.953438 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.953656 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.954008 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-72t6m"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.955299 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.955432 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-kt8q6"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.956498 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-kt8q6"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.959573 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.961155 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.962044 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.966580 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.968382 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.968702 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bfffb"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.969366 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.971101 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.973916 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-49n75"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.976214 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.978453 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.979843 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29521440-k6f7k"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.981619 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.984244 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-flvjk"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.986375 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-5bsn7"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.986964 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5bsn7"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.987996 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-t5827"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.994216 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-frmbv"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.995947 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5jsj6"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.996248 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.996876 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.998295 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv"]
Feb 17 00:08:01 crc kubenswrapper[4791]: I0217 00:08:01.999734 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.000853 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.002088 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.003109 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.006223 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.006510 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.019420 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tlpgd"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.020473 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4chtt"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.020542 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.021281 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4chtt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.021517 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-stqb9"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.022526 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8zpf"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.023542 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.024559 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.026064 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.027288 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7pqm8"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.028402 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.029428 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wfpf2"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.030491 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.031577 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"]
Feb
17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.032696 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.034312 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.035990 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-72t6m"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.036308 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.037489 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5nwz7"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.038765 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4chtt"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.039815 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tlpgd"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.041038 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.041368 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-config\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.041496 4791 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-etcd-client\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.041616 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4360bf41-9e45-498e-8f94-2c43a0dc88e5-config\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.041727 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pgvc\" (UniqueName: \"kubernetes.io/projected/4360bf41-9e45-498e-8f94-2c43a0dc88e5-kube-api-access-8pgvc\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.041852 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3032312-913c-4072-ac18-56fdc689cbac-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.042020 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.042956 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-config\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043256 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b1913d4-85d3-4596-acea-6e272cf81e8e-audit-dir\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043339 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbqwz\" (UniqueName: \"kubernetes.io/projected/1b1913d4-85d3-4596-acea-6e272cf81e8e-kube-api-access-cbqwz\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043374 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/643578b4-75ca-4765-8df5-9167688e3ced-audit-dir\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043386 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b1913d4-85d3-4596-acea-6e272cf81e8e-audit-dir\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " 
pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043395 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp6bb\" (UniqueName: \"kubernetes.io/projected/839b6744-bbe6-4b56-b020-181d86c604fe-kube-api-access-pp6bb\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043477 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b747aa6-3874-4f71-86bb-d340398d7bc4-config\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043518 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4ssg\" (UniqueName: \"kubernetes.io/projected/5f00b345-a265-41cc-89b7-6f059fc4d5d1-kube-api-access-f4ssg\") pod \"cluster-samples-operator-665b6dd947-qzgkr\" (UID: \"5f00b345-a265-41cc-89b7-6f059fc4d5d1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043554 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043577 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-serving-cert\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043277 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4360bf41-9e45-498e-8f94-2c43a0dc88e5-config\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043717 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/643578b4-75ca-4765-8df5-9167688e3ced-audit-dir\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043839 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/03d7a8df-a8a3-4b34-bd28-d554ae70875a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kqr99\" (UID: \"03d7a8df-a8a3-4b34-bd28-d554ae70875a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043908 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-serving-cert\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.043968 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-serving-cert\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044003 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4kmn\" (UniqueName: \"kubernetes.io/projected/5526b957-e33f-4952-8dda-d2875c94686a-kube-api-access-w4kmn\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044035 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f0fa93-740f-43aa-9350-24d9920a9345-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044080 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b747aa6-3874-4f71-86bb-d340398d7bc4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044173 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ad62ba3d-c60a-4e1f-9768-187e74151f24-available-featuregates\") pod 
\"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044215 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-etcd-service-ca\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044250 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g6bf\" (UniqueName: \"kubernetes.io/projected/e3032312-913c-4072-ac18-56fdc689cbac-kube-api-access-8g6bf\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044293 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-stats-auth\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044330 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-config\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044368 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044437 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c752f56-7754-4718-aea5-cb41d6ac4253-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044467 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-serving-cert\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044499 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044536 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: 
\"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044600 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044656 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbb6h\" (UniqueName: \"kubernetes.io/projected/03d7a8df-a8a3-4b34-bd28-d554ae70875a-kube-api-access-zbb6h\") pod \"control-plane-machine-set-operator-78cbb6b69f-kqr99\" (UID: \"03d7a8df-a8a3-4b34-bd28-d554ae70875a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044737 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b747aa6-3874-4f71-86bb-d340398d7bc4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044762 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv2ts\" (UniqueName: \"kubernetes.io/projected/fe44c059-87ef-4805-b78f-b8c3cdfd844e-kube-api-access-rv2ts\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 
00:08:02.044805 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2jpw\" (UniqueName: \"kubernetes.io/projected/9578978b-522d-48d8-9b08-384752fc49a1-kube-api-access-k2jpw\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044850 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c752f56-7754-4718-aea5-cb41d6ac4253-config\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044890 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/155619c1-12ba-4149-9dce-474e3735168c-console-serving-cert\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044910 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-etcd-ca\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044950 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgt9z\" (UniqueName: \"kubernetes.io/projected/94401a93-55c7-4e8b-83f7-dc27a876f335-kube-api-access-cgt9z\") pod \"image-pruner-29521440-k6f7k\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 
00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.044975 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/155619c1-12ba-4149-9dce-474e3735168c-console-oauth-config\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045049 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b6c19ecc-0208-46de-8c03-6780bba30353-tmpfs\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045078 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d469ce1-e7ed-4826-a378-0de16f2b4e56-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045100 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045124 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt7pk\" (UniqueName: \"kubernetes.io/projected/155619c1-12ba-4149-9dce-474e3735168c-kube-api-access-gt7pk\") pod 
\"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045158 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-client-ca\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045234 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6c19ecc-0208-46de-8c03-6780bba30353-apiservice-cert\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045304 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045327 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-image-import-ca\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045395 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045416 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045487 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3032312-913c-4072-ac18-56fdc689cbac-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045526 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4qdh\" (UniqueName: \"kubernetes.io/projected/71967495-8841-4810-89e5-e114b9887c5e-kube-api-access-l4qdh\") pod \"package-server-manager-789f6589d5-r2rv7\" (UID: \"71967495-8841-4810-89e5-e114b9887c5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045549 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9578978b-522d-48d8-9b08-384752fc49a1-signing-key\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: 
\"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045571 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-encryption-config\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045594 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-config\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045623 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-trusted-ca-bundle\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045644 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/839b6744-bbe6-4b56-b020-181d86c604fe-machine-approver-tls\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045669 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8zlc\" (UniqueName: 
\"kubernetes.io/projected/713c3460-f77d-4f7b-81bf-911f8f875dfe-kube-api-access-k8zlc\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045686 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/94401a93-55c7-4e8b-83f7-dc27a876f335-serviceca\") pod \"image-pruner-29521440-k6f7k\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045740 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-config\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045775 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045800 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.045989 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qrvj\" (UniqueName: \"kubernetes.io/projected/459f3992-b770-44d7-9ecc-0ae8a228134f-kube-api-access-2qrvj\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046097 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5526b957-e33f-4952-8dda-d2875c94686a-serving-cert\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046182 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-encryption-config\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046244 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svp66\" (UniqueName: \"kubernetes.io/projected/c50f96b4-3a86-4edc-b9d5-82fe3181b8a4-kube-api-access-svp66\") pod \"multus-admission-controller-857f4d67dd-7pqm8\" (UID: \"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046274 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f00b345-a265-41cc-89b7-6f059fc4d5d1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qzgkr\" (UID: \"5f00b345-a265-41cc-89b7-6f059fc4d5d1\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046324 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad62ba3d-c60a-4e1f-9768-187e74151f24-serving-cert\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046389 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7f0fa93-740f-43aa-9350-24d9920a9345-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046426 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6c19ecc-0208-46de-8c03-6780bba30353-webhook-cert\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046466 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-console-config\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046487 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-audit-policies\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046511 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe44c059-87ef-4805-b78f-b8c3cdfd844e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046537 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9578978b-522d-48d8-9b08-384752fc49a1-signing-cabundle\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046612 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4360bf41-9e45-498e-8f94-2c43a0dc88e5-trusted-ca\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046641 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-default-certificate\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 
00:08:02.046702 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c50f96b4-3a86-4edc-b9d5-82fe3181b8a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7pqm8\" (UID: \"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046725 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1b1913d4-85d3-4596-acea-6e272cf81e8e-node-pullsecrets\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046923 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046945 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-config\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046971 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdjvm\" (UniqueName: \"kubernetes.io/projected/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-kube-api-access-pdjvm\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: 
\"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.046995 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4360bf41-9e45-498e-8f94-2c43a0dc88e5-serving-cert\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047058 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw65g\" (UniqueName: \"kubernetes.io/projected/ad62ba3d-c60a-4e1f-9768-187e74151f24-kube-api-access-vw65g\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047077 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047175 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9z7g\" (UniqueName: \"kubernetes.io/projected/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-kube-api-access-x9z7g\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047214 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-client-ca\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047246 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047281 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d469ce1-e7ed-4826-a378-0de16f2b4e56-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047314 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-oauth-serving-cert\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047347 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/839b6744-bbe6-4b56-b020-181d86c604fe-auth-proxy-config\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047380 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-audit\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047382 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-serving-cert\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047408 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvqmg\" (UniqueName: \"kubernetes.io/projected/ee8894f1-34ac-4df2-bf1a-01e1a110d6c9-kube-api-access-wvqmg\") pod \"dns-operator-744455d44c-72t6m\" (UID: \"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9\") " pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047457 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9pgd\" (UniqueName: \"kubernetes.io/projected/3d469ce1-e7ed-4826-a378-0de16f2b4e56-kube-api-access-m9pgd\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047486 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-966rv\" (UniqueName: 
\"kubernetes.io/projected/b6c19ecc-0208-46de-8c03-6780bba30353-kube-api-access-966rv\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047513 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/459f3992-b770-44d7-9ecc-0ae8a228134f-service-ca-bundle\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047540 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-etcd-serving-ca\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047571 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5526b957-e33f-4952-8dda-d2875c94686a-etcd-client\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047608 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe44c059-87ef-4805-b78f-b8c3cdfd844e-images\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047642 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnsxm\" (UniqueName: \"kubernetes.io/projected/b7f0fa93-740f-43aa-9350-24d9920a9345-kube-api-access-gnsxm\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047670 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5ph6\" (UniqueName: \"kubernetes.io/projected/c5fb65f7-7cc6-4834-853e-a91eebc956fd-kube-api-access-r5ph6\") pod \"migrator-59844c95c7-952sm\" (UID: \"c5fb65f7-7cc6-4834-853e-a91eebc956fd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047700 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-etcd-client\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047735 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9c752f56-7754-4718-aea5-cb41d6ac4253-images\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047754 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-serving-cert\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047768 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f0fa93-740f-43aa-9350-24d9920a9345-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047806 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-metrics-certs\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047849 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047876 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/839b6744-bbe6-4b56-b020-181d86c604fe-config\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047909 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-policies\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047930 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-service-ca\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047952 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8kss\" (UniqueName: \"kubernetes.io/projected/643578b4-75ca-4765-8df5-9167688e3ced-kube-api-access-x8kss\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047970 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee8894f1-34ac-4df2-bf1a-01e1a110d6c9-metrics-tls\") pod \"dns-operator-744455d44c-72t6m\" (UID: \"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9\") " pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.047993 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-dir\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048046 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe44c059-87ef-4805-b78f-b8c3cdfd844e-proxy-tls\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048072 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-service-ca-bundle\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048097 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048121 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/713c3460-f77d-4f7b-81bf-911f8f875dfe-serving-cert\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048158 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048182 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsnph\" (UniqueName: \"kubernetes.io/projected/0522c983-dae6-41ca-807a-ff45912a0024-kube-api-access-fsnph\") pod \"downloads-7954f5f757-rt865\" (UID: \"0522c983-dae6-41ca-807a-ff45912a0024\") " pod="openshift-console/downloads-7954f5f757-rt865" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048215 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws66n\" (UniqueName: \"kubernetes.io/projected/c70fe9d3-348d-4bb8-89f7-21027041131a-kube-api-access-ws66n\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048249 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048271 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49kkg\" (UniqueName: \"kubernetes.io/projected/9c752f56-7754-4718-aea5-cb41d6ac4253-kube-api-access-49kkg\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048294 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/71967495-8841-4810-89e5-e114b9887c5e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r2rv7\" (UID: \"71967495-8841-4810-89e5-e114b9887c5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048470 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.048603 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.049165 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ad62ba3d-c60a-4e1f-9768-187e74151f24-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.049230 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-etcd-service-ca\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.049598 4791 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.049647 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.050554 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-serving-cert\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.050623 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bfffb"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.052459 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.053353 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-config\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.053809 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5526b957-e33f-4952-8dda-d2875c94686a-etcd-ca\") pod \"etcd-operator-b45778765-stqb9\" (UID: 
\"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.055504 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.055654 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.055957 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-946wq"] Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.056365 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c752f56-7754-4718-aea5-cb41d6ac4253-config\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.056839 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:02 crc 
kubenswrapper[4791]: I0217 00:08:02.057302 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.057561 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-config\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.058620 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.058796 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-946wq"]
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.058795 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9c752f56-7754-4718-aea5-cb41d6ac4253-images\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.058908 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-946wq"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.059161 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-etcd-client\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.059700 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-config\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.060368 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-encryption-config\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.060386 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-image-import-ca\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.060524 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-etcd-serving-ca\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.060591 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.060561 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5526b957-e33f-4952-8dda-d2875c94686a-serving-cert\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.060676 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1b1913d4-85d3-4596-acea-6e272cf81e8e-node-pullsecrets\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.061067 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/643578b4-75ca-4765-8df5-9167688e3ced-serving-cert\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.061301 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/155619c1-12ba-4149-9dce-474e3735168c-console-oauth-config\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.061460 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f0fa93-740f-43aa-9350-24d9920a9345-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.061536 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f00b345-a265-41cc-89b7-6f059fc4d5d1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qzgkr\" (UID: \"5f00b345-a265-41cc-89b7-6f059fc4d5d1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.062055 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-client-ca\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.062214 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-dir\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.062333 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-encryption-config\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.062730 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/839b6744-bbe6-4b56-b020-181d86c604fe-config\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.062940 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d469ce1-e7ed-4826-a378-0de16f2b4e56-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.063169 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-console-config\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.063288 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/94401a93-55c7-4e8b-83f7-dc27a876f335-serviceca\") pod \"image-pruner-29521440-k6f7k\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " pod="openshift-image-registry/image-pruner-29521440-k6f7k"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.063322 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.063430 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-trusted-ca-bundle\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.063648 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c752f56-7754-4718-aea5-cb41d6ac4253-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.063734 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/643578b4-75ca-4765-8df5-9167688e3ced-audit-policies\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.064486 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/839b6744-bbe6-4b56-b020-181d86c604fe-auth-proxy-config\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.064629 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d469ce1-e7ed-4826-a378-0de16f2b4e56-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.064627 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1b1913d4-85d3-4596-acea-6e272cf81e8e-etcd-client\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.064836 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-oauth-serving-cert\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.064837 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-policies\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.065051 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.065672 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5526b957-e33f-4952-8dda-d2875c94686a-etcd-client\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.066078 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1b1913d4-85d3-4596-acea-6e272cf81e8e-audit\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.066111 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/155619c1-12ba-4149-9dce-474e3735168c-service-ca\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.066383 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-client-ca\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.066470 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-config\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.066484 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f0fa93-740f-43aa-9350-24d9920a9345-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.067022 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-service-ca-bundle\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.067439 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/839b6744-bbe6-4b56-b020-181d86c604fe-machine-approver-tls\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.067471 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.067704 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4360bf41-9e45-498e-8f94-2c43a0dc88e5-trusted-ca\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.067986 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.068358 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.068548 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4360bf41-9e45-498e-8f94-2c43a0dc88e5-serving-cert\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.069684 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/713c3460-f77d-4f7b-81bf-911f8f875dfe-serving-cert\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.069886 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.070381 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/155619c1-12ba-4149-9dce-474e3735168c-console-serving-cert\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.070462 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.070558 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.072680 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad62ba3d-c60a-4e1f-9768-187e74151f24-serving-cert\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.077104 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.096201 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.116364 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.136384 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149190 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/03d7a8df-a8a3-4b34-bd28-d554ae70875a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kqr99\" (UID: \"03d7a8df-a8a3-4b34-bd28-d554ae70875a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149228 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b747aa6-3874-4f71-86bb-d340398d7bc4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149249 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g6bf\" (UniqueName: \"kubernetes.io/projected/e3032312-913c-4072-ac18-56fdc689cbac-kube-api-access-8g6bf\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149280 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-stats-auth\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149306 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149335 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbb6h\" (UniqueName: \"kubernetes.io/projected/03d7a8df-a8a3-4b34-bd28-d554ae70875a-kube-api-access-zbb6h\") pod \"control-plane-machine-set-operator-78cbb6b69f-kqr99\" (UID: \"03d7a8df-a8a3-4b34-bd28-d554ae70875a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149377 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b747aa6-3874-4f71-86bb-d340398d7bc4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149399 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv2ts\" (UniqueName: \"kubernetes.io/projected/fe44c059-87ef-4805-b78f-b8c3cdfd844e-kube-api-access-rv2ts\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149421 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2jpw\" (UniqueName: \"kubernetes.io/projected/9578978b-522d-48d8-9b08-384752fc49a1-kube-api-access-k2jpw\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149479 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b6c19ecc-0208-46de-8c03-6780bba30353-tmpfs\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149511 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6c19ecc-0208-46de-8c03-6780bba30353-apiservice-cert\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149553 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149571 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3032312-913c-4072-ac18-56fdc689cbac-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149630 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4qdh\" (UniqueName: \"kubernetes.io/projected/71967495-8841-4810-89e5-e114b9887c5e-kube-api-access-l4qdh\") pod \"package-server-manager-789f6589d5-r2rv7\" (UID: \"71967495-8841-4810-89e5-e114b9887c5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149699 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9578978b-522d-48d8-9b08-384752fc49a1-signing-key\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149737 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qrvj\" (UniqueName: \"kubernetes.io/projected/459f3992-b770-44d7-9ecc-0ae8a228134f-kube-api-access-2qrvj\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149788 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svp66\" (UniqueName: \"kubernetes.io/projected/c50f96b4-3a86-4edc-b9d5-82fe3181b8a4-kube-api-access-svp66\") pod \"multus-admission-controller-857f4d67dd-7pqm8\" (UID: \"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149811 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6c19ecc-0208-46de-8c03-6780bba30353-webhook-cert\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149827 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe44c059-87ef-4805-b78f-b8c3cdfd844e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.149981 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9578978b-522d-48d8-9b08-384752fc49a1-signing-cabundle\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150059 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-default-certificate\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150090 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c50f96b4-3a86-4edc-b9d5-82fe3181b8a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7pqm8\" (UID: \"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150191 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150271 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvqmg\" (UniqueName: \"kubernetes.io/projected/ee8894f1-34ac-4df2-bf1a-01e1a110d6c9-kube-api-access-wvqmg\") pod \"dns-operator-744455d44c-72t6m\" (UID: \"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9\") " pod="openshift-dns-operator/dns-operator-744455d44c-72t6m"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150332 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-966rv\" (UniqueName: \"kubernetes.io/projected/b6c19ecc-0208-46de-8c03-6780bba30353-kube-api-access-966rv\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150356 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/459f3992-b770-44d7-9ecc-0ae8a228134f-service-ca-bundle\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150412 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe44c059-87ef-4805-b78f-b8c3cdfd844e-images\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150445 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5ph6\" (UniqueName: \"kubernetes.io/projected/c5fb65f7-7cc6-4834-853e-a91eebc956fd-kube-api-access-r5ph6\") pod \"migrator-59844c95c7-952sm\" (UID: \"c5fb65f7-7cc6-4834-853e-a91eebc956fd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150503 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-metrics-certs\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150541 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee8894f1-34ac-4df2-bf1a-01e1a110d6c9-metrics-tls\") pod \"dns-operator-744455d44c-72t6m\" (UID: \"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9\") " pod="openshift-dns-operator/dns-operator-744455d44c-72t6m"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150568 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe44c059-87ef-4805-b78f-b8c3cdfd844e-proxy-tls\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150648 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/71967495-8841-4810-89e5-e114b9887c5e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r2rv7\" (UID: \"71967495-8841-4810-89e5-e114b9887c5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150694 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3032312-913c-4072-ac18-56fdc689cbac-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150749 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b747aa6-3874-4f71-86bb-d340398d7bc4-config\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.150926 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe44c059-87ef-4805-b78f-b8c3cdfd844e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.151016 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b6c19ecc-0208-46de-8c03-6780bba30353-tmpfs\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.163785 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.177527 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.197382 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.216349 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.224005 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.239207 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.256663 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.260187 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.278489 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.297641 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.318896 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.338311 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.343359 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3032312-913c-4072-ac18-56fdc689cbac-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.356992 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.361880 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/e3032312-913c-4072-ac18-56fdc689cbac-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.377183 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.396360 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.418254 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.436942 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.456925 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.462450 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b747aa6-3874-4f71-86bb-d340398d7bc4-config\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.477928 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 
00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.497259 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.504776 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b747aa6-3874-4f71-86bb-d340398d7bc4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.516955 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.537966 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.557520 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.577399 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.598175 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.617508 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.636781 4791 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.657739 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.664867 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/03d7a8df-a8a3-4b34-bd28-d554ae70875a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kqr99\" (UID: \"03d7a8df-a8a3-4b34-bd28-d554ae70875a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.677442 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.697666 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.706397 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c50f96b4-3a86-4edc-b9d5-82fe3181b8a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7pqm8\" (UID: \"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.718164 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.736647 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.757243 4791 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.777456 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.797079 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.817627 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.837608 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.856733 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.865502 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6c19ecc-0208-46de-8c03-6780bba30353-webhook-cert\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.865707 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6c19ecc-0208-46de-8c03-6780bba30353-apiservice-cert\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.877876 4791 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.896753 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.904061 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe44c059-87ef-4805-b78f-b8c3cdfd844e-images\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.917481 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.925359 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9578978b-522d-48d8-9b08-384752fc49a1-signing-key\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.936986 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.955814 4791 request.go:700] Waited for 1.009901131s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/configmaps?fieldSelector=metadata.name%3Dsigning-cabundle&limit=500&resourceVersion=0 Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.957937 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 00:08:02 crc 
kubenswrapper[4791]: I0217 00:08:02.962399 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9578978b-522d-48d8-9b08-384752fc49a1-signing-cabundle\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.977742 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 00:08:02 crc kubenswrapper[4791]: I0217 00:08:02.998365 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.017921 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.025989 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe44c059-87ef-4805-b78f-b8c3cdfd844e-proxy-tls\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.037669 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.057523 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.077563 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 00:08:03 crc 
kubenswrapper[4791]: I0217 00:08:03.096448 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.107117 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/71967495-8841-4810-89e5-e114b9887c5e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r2rv7\" (UID: \"71967495-8841-4810-89e5-e114b9887c5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.117619 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.126264 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee8894f1-34ac-4df2-bf1a-01e1a110d6c9-metrics-tls\") pod \"dns-operator-744455d44c-72t6m\" (UID: \"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9\") " pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.136898 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.149914 4791 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.150045 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-stats-auth podName:459f3992-b770-44d7-9ecc-0ae8a228134f nodeName:}" failed. No retries permitted until 2026-02-17 00:08:03.650017401 +0000 UTC m=+141.129530038 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-stats-auth") pod "router-default-5444994796-kt8q6" (UID: "459f3992-b770-44d7-9ecc-0ae8a228134f") : failed to sync secret cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.152658 4791 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.152722 4791 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.152733 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/459f3992-b770-44d7-9ecc-0ae8a228134f-service-ca-bundle podName:459f3992-b770-44d7-9ecc-0ae8a228134f nodeName:}" failed. No retries permitted until 2026-02-17 00:08:03.652716538 +0000 UTC m=+141.132229175 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/459f3992-b770-44d7-9ecc-0ae8a228134f-service-ca-bundle") pod "router-default-5444994796-kt8q6" (UID: "459f3992-b770-44d7-9ecc-0ae8a228134f") : failed to sync configmap cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.152797 4791 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.152831 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-default-certificate podName:459f3992-b770-44d7-9ecc-0ae8a228134f nodeName:}" failed. 
No retries permitted until 2026-02-17 00:08:03.652805601 +0000 UTC m=+141.132318158 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-default-certificate") pod "router-default-5444994796-kt8q6" (UID: "459f3992-b770-44d7-9ecc-0ae8a228134f") : failed to sync secret cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: E0217 00:08:03.152855 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-metrics-certs podName:459f3992-b770-44d7-9ecc-0ae8a228134f nodeName:}" failed. No retries permitted until 2026-02-17 00:08:03.652841142 +0000 UTC m=+141.132353679 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-metrics-certs") pod "router-default-5444994796-kt8q6" (UID: "459f3992-b770-44d7-9ecc-0ae8a228134f") : failed to sync secret cache: timed out waiting for the condition Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.157065 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.176887 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.196852 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.217055 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.236677 4791 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-stats-default" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.257125 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.276910 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.296246 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.316928 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.337921 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.356813 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.398518 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.417435 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.438408 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.457480 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.477089 4791 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.498171 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.517388 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.537346 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.557536 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.589525 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.597672 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.617743 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.638508 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.657313 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.697961 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 00:08:03 crc 
kubenswrapper[4791]: I0217 00:08:03.703519 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-stats-auth\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.703683 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-default-certificate\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.703745 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/459f3992-b770-44d7-9ecc-0ae8a228134f-service-ca-bundle\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.703801 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-metrics-certs\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.705783 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/459f3992-b770-44d7-9ecc-0ae8a228134f-service-ca-bundle\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 
00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.707851 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-stats-auth\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.708490 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-metrics-certs\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.711008 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/459f3992-b770-44d7-9ecc-0ae8a228134f-default-certificate\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.717301 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.737591 4791 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.756798 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.776616 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.797977 4791 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.839387 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pgvc\" (UniqueName: \"kubernetes.io/projected/4360bf41-9e45-498e-8f94-2c43a0dc88e5-kube-api-access-8pgvc\") pod \"console-operator-58897d9998-t5827\" (UID: \"4360bf41-9e45-498e-8f94-2c43a0dc88e5\") " pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.854312 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbqwz\" (UniqueName: \"kubernetes.io/projected/1b1913d4-85d3-4596-acea-6e272cf81e8e-kube-api-access-cbqwz\") pod \"apiserver-76f77b778f-flvjk\" (UID: \"1b1913d4-85d3-4596-acea-6e272cf81e8e\") " pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.874915 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp6bb\" (UniqueName: \"kubernetes.io/projected/839b6744-bbe6-4b56-b020-181d86c604fe-kube-api-access-pp6bb\") pod \"machine-approver-56656f9798-d7qrh\" (UID: \"839b6744-bbe6-4b56-b020-181d86c604fe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.894607 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4ssg\" (UniqueName: \"kubernetes.io/projected/5f00b345-a265-41cc-89b7-6f059fc4d5d1-kube-api-access-f4ssg\") pod \"cluster-samples-operator-665b6dd947-qzgkr\" (UID: \"5f00b345-a265-41cc-89b7-6f059fc4d5d1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.916214 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4kmn\" (UniqueName: 
\"kubernetes.io/projected/5526b957-e33f-4952-8dda-d2875c94686a-kube-api-access-w4kmn\") pod \"etcd-operator-b45778765-stqb9\" (UID: \"5526b957-e33f-4952-8dda-d2875c94686a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.941531 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9z7g\" (UniqueName: \"kubernetes.io/projected/d15f89df-5eaf-48c8-963d-bc3e1c79bd43-kube-api-access-x9z7g\") pod \"authentication-operator-69f744f599-49n75\" (UID: \"d15f89df-5eaf-48c8-963d-bc3e1c79bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.950118 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.961000 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9pgd\" (UniqueName: \"kubernetes.io/projected/3d469ce1-e7ed-4826-a378-0de16f2b4e56-kube-api-access-m9pgd\") pod \"openshift-apiserver-operator-796bbdcf4f-8lhkg\" (UID: \"3d469ce1-e7ed-4826-a378-0de16f2b4e56\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.973306 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgt9z\" (UniqueName: \"kubernetes.io/projected/94401a93-55c7-4e8b-83f7-dc27a876f335-kube-api-access-cgt9z\") pod \"image-pruner-29521440-k6f7k\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.975788 4791 request.go:700] Waited for 1.917160841s due to client-side throttling, not priority and fairness, request: 
POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/console/token Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.992486 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt7pk\" (UniqueName: \"kubernetes.io/projected/155619c1-12ba-4149-9dce-474e3735168c-kube-api-access-gt7pk\") pod \"console-f9d7485db-frmbv\" (UID: \"155619c1-12ba-4149-9dce-474e3735168c\") " pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:03 crc kubenswrapper[4791]: I0217 00:08:03.997406 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.018084 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.027043 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.040603 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.057312 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.067741 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.067752 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.076633 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.081855 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnsxm\" (UniqueName: \"kubernetes.io/projected/b7f0fa93-740f-43aa-9350-24d9920a9345-kube-api-access-gnsxm\") pod \"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.109032 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsnph\" (UniqueName: \"kubernetes.io/projected/0522c983-dae6-41ca-807a-ff45912a0024-kube-api-access-fsnph\") pod \"downloads-7954f5f757-rt865\" (UID: \"0522c983-dae6-41ca-807a-ff45912a0024\") " pod="openshift-console/downloads-7954f5f757-rt865" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.117357 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws66n\" (UniqueName: \"kubernetes.io/projected/c70fe9d3-348d-4bb8-89f7-21027041131a-kube-api-access-ws66n\") pod \"oauth-openshift-558db77b4-r8zpf\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.121271 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.129216 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.138427 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.140486 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8zlc\" (UniqueName: \"kubernetes.io/projected/713c3460-f77d-4f7b-81bf-911f8f875dfe-kube-api-access-k8zlc\") pod \"controller-manager-879f6c89f-jftdn\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.153716 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8kss\" (UniqueName: \"kubernetes.io/projected/643578b4-75ca-4765-8df5-9167688e3ced-kube-api-access-x8kss\") pod \"apiserver-7bbb656c7d-sgzjl\" (UID: \"643578b4-75ca-4765-8df5-9167688e3ced\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.156980 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.194652 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.202489 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw65g\" (UniqueName: \"kubernetes.io/projected/ad62ba3d-c60a-4e1f-9768-187e74151f24-kube-api-access-vw65g\") pod \"openshift-config-operator-7777fb866f-qxk8k\" (UID: \"ad62ba3d-c60a-4e1f-9768-187e74151f24\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.212439 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7f0fa93-740f-43aa-9350-24d9920a9345-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-fpr65\" (UID: \"b7f0fa93-740f-43aa-9350-24d9920a9345\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.233554 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.234072 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdjvm\" (UniqueName: \"kubernetes.io/projected/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-kube-api-access-pdjvm\") pod \"route-controller-manager-6576b87f9c-ht455\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.253576 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49kkg\" (UniqueName: \"kubernetes.io/projected/9c752f56-7754-4718-aea5-cb41d6ac4253-kube-api-access-49kkg\") pod \"machine-api-operator-5694c8668f-5nwz7\" (UID: \"9c752f56-7754-4718-aea5-cb41d6ac4253\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.271801 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.272550 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b747aa6-3874-4f71-86bb-d340398d7bc4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8nb5r\" (UID: \"0b747aa6-3874-4f71-86bb-d340398d7bc4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.284456 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29521440-k6f7k"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.291320 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbb6h\" (UniqueName: \"kubernetes.io/projected/03d7a8df-a8a3-4b34-bd28-d554ae70875a-kube-api-access-zbb6h\") pod \"control-plane-machine-set-operator-78cbb6b69f-kqr99\" (UID: \"03d7a8df-a8a3-4b34-bd28-d554ae70875a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.309868 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-rt865" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.324391 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e2cf0a7-86f6-4858-98ad-08c4c644deb9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fpv5b\" (UID: \"5e2cf0a7-86f6-4858-98ad-08c4c644deb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.332066 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g6bf\" (UniqueName: \"kubernetes.io/projected/e3032312-913c-4072-ac18-56fdc689cbac-kube-api-access-8g6bf\") pod \"kube-storage-version-migrator-operator-b67b599dd-fs52s\" (UID: \"e3032312-913c-4072-ac18-56fdc689cbac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" Feb 17 00:08:04 crc kubenswrapper[4791]: W0217 00:08:04.334717 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94401a93_55c7_4e8b_83f7_dc27a876f335.slice/crio-142ab3f00f5019c1be7909e2cdfb055edb77c1758b592a88fa5914ec63b0bda2 WatchSource:0}: Error finding container 142ab3f00f5019c1be7909e2cdfb055edb77c1758b592a88fa5914ec63b0bda2: Status 404 returned error can't find the container with id 142ab3f00f5019c1be7909e2cdfb055edb77c1758b592a88fa5914ec63b0bda2 Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.363773 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv2ts\" (UniqueName: \"kubernetes.io/projected/fe44c059-87ef-4805-b78f-b8c3cdfd844e-kube-api-access-rv2ts\") pod \"machine-config-operator-74547568cd-n2s28\" (UID: \"fe44c059-87ef-4805-b78f-b8c3cdfd844e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" Feb 
17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.364280 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.376357 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2jpw\" (UniqueName: \"kubernetes.io/projected/9578978b-522d-48d8-9b08-384752fc49a1-kube-api-access-k2jpw\") pod \"service-ca-9c57cc56f-5jsj6\" (UID: \"9578978b-522d-48d8-9b08-384752fc49a1\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.382251 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.388395 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.392466 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4qdh\" (UniqueName: \"kubernetes.io/projected/71967495-8841-4810-89e5-e114b9887c5e-kube-api-access-l4qdh\") pod \"package-server-manager-789f6589d5-r2rv7\" (UID: \"71967495-8841-4810-89e5-e114b9887c5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.397563 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.404903 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.409062 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qrvj\" (UniqueName: \"kubernetes.io/projected/459f3992-b770-44d7-9ecc-0ae8a228134f-kube-api-access-2qrvj\") pod \"router-default-5444994796-kt8q6\" (UID: \"459f3992-b770-44d7-9ecc-0ae8a228134f\") " pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.413023 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.430183 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svp66\" (UniqueName: \"kubernetes.io/projected/c50f96b4-3a86-4edc-b9d5-82fe3181b8a4-kube-api-access-svp66\") pod \"multus-admission-controller-857f4d67dd-7pqm8\" (UID: \"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.457849 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-966rv\" (UniqueName: \"kubernetes.io/projected/b6c19ecc-0208-46de-8c03-6780bba30353-kube-api-access-966rv\") pod \"packageserver-d55dfcdfc-bt9mf\" (UID: \"b6c19ecc-0208-46de-8c03-6780bba30353\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.476772 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvqmg\" (UniqueName: \"kubernetes.io/projected/ee8894f1-34ac-4df2-bf1a-01e1a110d6c9-kube-api-access-wvqmg\") pod \"dns-operator-744455d44c-72t6m\" (UID: \"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9\") " pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" Feb 17 
00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.492786 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5ph6\" (UniqueName: \"kubernetes.io/projected/c5fb65f7-7cc6-4834-853e-a91eebc956fd-kube-api-access-r5ph6\") pod \"migrator-59844c95c7-952sm\" (UID: \"c5fb65f7-7cc6-4834-853e-a91eebc956fd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.511739 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525353 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c33165ce-519a-4b0e-b62a-f153d38fc14c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525408 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9kxv\" (UniqueName: \"kubernetes.io/projected/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-kube-api-access-l9kxv\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525444 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-bound-sa-token\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 
00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525514 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/afb516ca-988f-4b77-aea0-10cd22ce2b77-metrics-tls\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525540 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525589 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-secret-volume\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525652 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-certificates\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525675 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-config\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525706 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afb516ca-988f-4b77-aea0-10cd22ce2b77-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525739 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-config-volume\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525761 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-trusted-ca\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525785 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525807 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8htvc\" (UniqueName: \"kubernetes.io/projected/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-kube-api-access-8htvc\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525845 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c33165ce-519a-4b0e-b62a-f153d38fc14c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525906 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kkpm\" (UniqueName: \"kubernetes.io/projected/57ed01e7-bfeb-428e-88c2-371662581ddf-kube-api-access-6kkpm\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525930 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525967 
4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525992 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptqw5\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-kube-api-access-ptqw5\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526010 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-proxy-tls\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526033 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvk22\" (UniqueName: \"kubernetes.io/projected/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-kube-api-access-tvk22\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526082 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/57ed01e7-bfeb-428e-88c2-371662581ddf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526182 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-tls\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526206 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ed01e7-bfeb-428e-88c2-371662581ddf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526230 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afb516ca-988f-4b77-aea0-10cd22ce2b77-trusted-ca\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526359 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hr4t\" (UniqueName: \"kubernetes.io/projected/afb516ca-988f-4b77-aea0-10cd22ce2b77-kube-api-access-8hr4t\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526431 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.526480 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-srv-cert\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: E0217 00:08:04.527809 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.02779327 +0000 UTC m=+142.507305887 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.525034 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.535454 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.552067 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.559041 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.559170 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-flvjk"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.565958 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-49n75"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.584975 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.594090 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.604752 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" Feb 17 00:08:04 crc kubenswrapper[4791]: W0217 00:08:04.607672 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b1913d4_85d3_4596_acea_6e272cf81e8e.slice/crio-dd92dbba7d84f967b1d4396b47a319688a637672b5b28895be564fb0425a3769 WatchSource:0}: Error finding container dd92dbba7d84f967b1d4396b47a319688a637672b5b28895be564fb0425a3769: Status 404 returned error can't find the container with id dd92dbba7d84f967b1d4396b47a319688a637672b5b28895be564fb0425a3769 Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.614479 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.627034 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.627263 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ed01e7-bfeb-428e-88c2-371662581ddf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.627289 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-metrics-tls\") pod \"dns-default-4chtt\" 
(UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.627308 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-profile-collector-cert\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.627325 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-tls\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.627340 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ed01e7-bfeb-428e-88c2-371662581ddf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.627358 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-plugins-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: E0217 00:08:04.627677 4791 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.12762378 +0000 UTC m=+142.607136307 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.628357 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ed01e7-bfeb-428e-88c2-371662581ddf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.628418 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afb516ca-988f-4b77-aea0-10cd22ce2b77-trusted-ca\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.628443 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkz8g\" (UniqueName: \"kubernetes.io/projected/1ce1b285-b6aa-4361-aa9e-5274a9863b6a-kube-api-access-kkz8g\") pod \"ingress-canary-946wq\" (UID: \"1ce1b285-b6aa-4361-aa9e-5274a9863b6a\") " 
pod="openshift-ingress-canary/ingress-canary-946wq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.629354 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afb516ca-988f-4b77-aea0-10cd22ce2b77-trusted-ca\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.629703 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hr4t\" (UniqueName: \"kubernetes.io/projected/afb516ca-988f-4b77-aea0-10cd22ce2b77-kube-api-access-8hr4t\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.629793 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.629814 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-certs\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.630608 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-srv-cert\") pod 
\"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: E0217 00:08:04.630624 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.130610436 +0000 UTC m=+142.610122963 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.649710 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-srv-cert\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.649794 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ce1b285-b6aa-4361-aa9e-5274a9863b6a-cert\") pod \"ingress-canary-946wq\" (UID: \"1ce1b285-b6aa-4361-aa9e-5274a9863b6a\") " pod="openshift-ingress-canary/ingress-canary-946wq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.649815 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-registration-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.649845 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c33165ce-519a-4b0e-b62a-f153d38fc14c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.649888 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-socket-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.649914 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l6xj\" (UniqueName: \"kubernetes.io/projected/13a5be44-f180-42a9-bff7-8ba69cc589f0-kube-api-access-2l6xj\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.649939 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9kxv\" (UniqueName: \"kubernetes.io/projected/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-kube-api-access-l9kxv\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: 
I0217 00:08:04.649964 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46st9\" (UniqueName: \"kubernetes.io/projected/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-kube-api-access-46st9\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.650062 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-bound-sa-token\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.650188 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8af772-70a9-4758-b597-363c1db463ad-config\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.650226 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/afb516ca-988f-4b77-aea0-10cd22ce2b77-metrics-tls\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.650294 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.650318 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5vvm\" (UniqueName: \"kubernetes.io/projected/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-kube-api-access-n5vvm\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.650339 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-csi-data-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.650381 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-secret-volume\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.651273 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-tls\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.651352 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.651517 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-mountpoint-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.651601 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-certificates\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.651624 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-config\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.651649 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-config-volume\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653007 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afb516ca-988f-4b77-aea0-10cd22ce2b77-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653042 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fz5j\" (UniqueName: \"kubernetes.io/projected/ae8af772-70a9-4758-b597-363c1db463ad-kube-api-access-7fz5j\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653386 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-config-volume\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653419 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653444 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-trusted-ca\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: 
\"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653466 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653491 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8htvc\" (UniqueName: \"kubernetes.io/projected/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-kube-api-access-8htvc\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.653513 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae8af772-70a9-4758-b597-363c1db463ad-serving-cert\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.655090 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57ed01e7-bfeb-428e-88c2-371662581ddf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.655911 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-certificates\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.656467 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-config\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.659602 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-frmbv"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.660090 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-config-volume\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.661092 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.662740 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-trusted-ca\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.667099 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg877\" (UniqueName: \"kubernetes.io/projected/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-kube-api-access-lg877\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.670465 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c33165ce-519a-4b0e-b62a-f153d38fc14c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.671376 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-node-bootstrap-token\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.671911 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/c33165ce-519a-4b0e-b62a-f153d38fc14c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.672165 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kkpm\" (UniqueName: \"kubernetes.io/projected/57ed01e7-bfeb-428e-88c2-371662581ddf-kube-api-access-6kkpm\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.673001 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.673238 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.675811 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.676275 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.677003 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptqw5\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-kube-api-access-ptqw5\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.677044 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-proxy-tls\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.677522 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvk22\" (UniqueName: \"kubernetes.io/projected/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-kube-api-access-tvk22\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.678212 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcqqr\" (UniqueName: \"kubernetes.io/projected/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-kube-api-access-fcqqr\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.678990 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/afb516ca-988f-4b77-aea0-10cd22ce2b77-metrics-tls\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.681202 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c33165ce-519a-4b0e-b62a-f153d38fc14c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.684424 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.685723 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.686362 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-proxy-tls\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.687546 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.689190 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-srv-cert\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.712063 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jftdn"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.713234 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hr4t\" (UniqueName: \"kubernetes.io/projected/afb516ca-988f-4b77-aea0-10cd22ce2b77-kube-api-access-8hr4t\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.713331 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-secret-volume\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.720405 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8543e6a7-7bb0-4a35-96c5-bcae0763cc78-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-57qch\" (UID: \"8543e6a7-7bb0-4a35-96c5-bcae0763cc78\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.731740 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8htvc\" (UniqueName: \"kubernetes.io/projected/7bf3a6ce-9a2b-49cc-9360-f552988f2b38-kube-api-access-8htvc\") pod \"olm-operator-6b444d44fb-dhmqq\" (UID: \"7bf3a6ce-9a2b-49cc-9360-f552988f2b38\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.771440 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-bound-sa-token\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779260 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779503 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkz8g\" (UniqueName: \"kubernetes.io/projected/1ce1b285-b6aa-4361-aa9e-5274a9863b6a-kube-api-access-kkz8g\") pod \"ingress-canary-946wq\" (UID: \"1ce1b285-b6aa-4361-aa9e-5274a9863b6a\") " pod="openshift-ingress-canary/ingress-canary-946wq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 
00:08:04.779545 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-certs\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779570 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-srv-cert\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779588 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ce1b285-b6aa-4361-aa9e-5274a9863b6a-cert\") pod \"ingress-canary-946wq\" (UID: \"1ce1b285-b6aa-4361-aa9e-5274a9863b6a\") " pod="openshift-ingress-canary/ingress-canary-946wq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779607 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-registration-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779634 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-socket-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779659 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2l6xj\" (UniqueName: \"kubernetes.io/projected/13a5be44-f180-42a9-bff7-8ba69cc589f0-kube-api-access-2l6xj\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779677 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46st9\" (UniqueName: \"kubernetes.io/projected/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-kube-api-access-46st9\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779709 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8af772-70a9-4758-b597-363c1db463ad-config\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779725 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5vvm\" (UniqueName: \"kubernetes.io/projected/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-kube-api-access-n5vvm\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779748 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-csi-data-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779767 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779782 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-mountpoint-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779800 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-config-volume\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779829 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fz5j\" (UniqueName: \"kubernetes.io/projected/ae8af772-70a9-4758-b597-363c1db463ad-kube-api-access-7fz5j\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779863 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779882 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae8af772-70a9-4758-b597-363c1db463ad-serving-cert\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779906 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg877\" (UniqueName: \"kubernetes.io/projected/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-kube-api-access-lg877\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779923 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-node-bootstrap-token\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779979 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcqqr\" (UniqueName: \"kubernetes.io/projected/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-kube-api-access-fcqqr\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.779999 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-profile-collector-cert\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.780019 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-metrics-tls\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.780036 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-plugins-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.780310 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-plugins-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: E0217 00:08:04.780382 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.280367261 +0000 UTC m=+142.759879788 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.782685 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-mountpoint-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.783266 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-certs\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.783317 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-config-volume\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.784629 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-csi-data-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: 
I0217 00:08:04.785678 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-registration-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.788685 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8af772-70a9-4758-b597-363c1db463ad-config\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.789277 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9kxv\" (UniqueName: \"kubernetes.io/projected/3baa46c8-1f0b-4b6f-95bb-94368bf6cc23-kube-api-access-l9kxv\") pod \"machine-config-controller-84d6567774-vb8fv\" (UID: \"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.789401 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-socket-dir\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.792086 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ce1b285-b6aa-4361-aa9e-5274a9863b6a-cert\") pod \"ingress-canary-946wq\" (UID: \"1ce1b285-b6aa-4361-aa9e-5274a9863b6a\") " pod="openshift-ingress-canary/ingress-canary-946wq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.792567 4791 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-srv-cert\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.792921 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae8af772-70a9-4758-b597-363c1db463ad-serving-cert\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.793584 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.794683 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.795956 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-node-bootstrap-token\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " 
pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.798883 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-metrics-tls\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.801524 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afb516ca-988f-4b77-aea0-10cd22ce2b77-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j7w5g\" (UID: \"afb516ca-988f-4b77-aea0-10cd22ce2b77\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.803515 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-profile-collector-cert\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.805644 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.813643 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kkpm\" (UniqueName: \"kubernetes.io/projected/57ed01e7-bfeb-428e-88c2-371662581ddf-kube-api-access-6kkpm\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nlbr\" (UID: \"57ed01e7-bfeb-428e-88c2-371662581ddf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.830553 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.840875 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptqw5\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-kube-api-access-ptqw5\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.841063 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.841250 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.843373 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-t5827"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.843529 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-stqb9"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.855480 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvk22\" (UniqueName: \"kubernetes.io/projected/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-kube-api-access-tvk22\") pod \"collect-profiles-29521440-crl2x\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: W0217 00:08:04.864077 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod459f3992_b770_44d7_9ecc_0ae8a228134f.slice/crio-5b13b672e6e00e92f256e5125759e26cc636b93b2b1c7ddba5a5960ceca650a0 WatchSource:0}: Error finding container 5b13b672e6e00e92f256e5125759e26cc636b93b2b1c7ddba5a5960ceca650a0: Status 404 returned error can't find the container with id 5b13b672e6e00e92f256e5125759e26cc636b93b2b1c7ddba5a5960ceca650a0 Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.866005 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.877702 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.881701 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: E0217 00:08:04.883784 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.383761226 +0000 UTC m=+142.863273823 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.918733 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkz8g\" (UniqueName: \"kubernetes.io/projected/1ce1b285-b6aa-4361-aa9e-5274a9863b6a-kube-api-access-kkz8g\") pod \"ingress-canary-946wq\" (UID: \"1ce1b285-b6aa-4361-aa9e-5274a9863b6a\") " pod="openshift-ingress-canary/ingress-canary-946wq" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.926013 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fz5j\" (UniqueName: 
\"kubernetes.io/projected/ae8af772-70a9-4758-b597-363c1db463ad-kube-api-access-7fz5j\") pod \"service-ca-operator-777779d784-xc8t7\" (UID: \"ae8af772-70a9-4758-b597-363c1db463ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.935695 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l6xj\" (UniqueName: \"kubernetes.io/projected/13a5be44-f180-42a9-bff7-8ba69cc589f0-kube-api-access-2l6xj\") pod \"marketplace-operator-79b997595-bfffb\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.957549 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5vvm\" (UniqueName: \"kubernetes.io/projected/811e9e22-1241-440a-9a6a-a6c51a0f0f7c-kube-api-access-n5vvm\") pod \"csi-hostpathplugin-tlpgd\" (UID: \"811e9e22-1241-440a-9a6a-a6c51a0f0f7c\") " pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.959790 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.960088 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.972422 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.982219 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rt865"] Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.983549 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:04 crc kubenswrapper[4791]: E0217 00:08:04.983696 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.483670338 +0000 UTC m=+142.963182865 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.983883 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:04 crc kubenswrapper[4791]: E0217 00:08:04.984327 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.484315609 +0000 UTC m=+142.963828136 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:04 crc kubenswrapper[4791]: I0217 00:08:04.987429 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46st9\" (UniqueName: \"kubernetes.io/projected/eeffaf81-97bf-4570-b2f4-4692c4bda9ac-kube-api-access-46st9\") pod \"dns-default-4chtt\" (UID: \"eeffaf81-97bf-4570-b2f4-4692c4bda9ac\") " pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.003375 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcqqr\" (UniqueName: \"kubernetes.io/projected/1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101-kube-api-access-fcqqr\") pod \"catalog-operator-68c6474976-hb9dg\" (UID: \"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.004031 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k"] Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.008823 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.013230 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg877\" (UniqueName: \"kubernetes.io/projected/d2cd3da0-c3e4-461c-93ed-064c8b3b0edd-kube-api-access-lg877\") pod \"machine-config-server-5bsn7\" (UID: \"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd\") " pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.015493 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.024577 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5bsn7" Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.049421 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.055164 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.064556 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-946wq" Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.090844 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8zpf"] Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.090945 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.091660 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.591631959 +0000 UTC m=+143.071144486 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.105474 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" event={"ID":"713c3460-f77d-4f7b-81bf-911f8f875dfe","Type":"ContainerStarted","Data":"f303e766e2c893e279f8d6e69a4b7c3a7060f8cee57e4d09d9bd789c6c1a5750"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.112047 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-t5827" event={"ID":"4360bf41-9e45-498e-8f94-2c43a0dc88e5","Type":"ContainerStarted","Data":"206b88e39f673e89ccffe9a9ef469f983b44a5b7b4d3a2eee4baa61d54a10cbe"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.116731 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5nwz7"] Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.119777 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl"] Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.128416 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29521440-k6f7k" event={"ID":"94401a93-55c7-4e8b-83f7-dc27a876f335","Type":"ContainerStarted","Data":"1bf210069f01dcf3433075dd8a895405951d971de359016b4eb9aa868416c26a"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.128536 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29521440-k6f7k" 
event={"ID":"94401a93-55c7-4e8b-83f7-dc27a876f335","Type":"ContainerStarted","Data":"142ab3f00f5019c1be7909e2cdfb055edb77c1758b592a88fa5914ec63b0bda2"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.131019 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kt8q6" event={"ID":"459f3992-b770-44d7-9ecc-0ae8a228134f","Type":"ContainerStarted","Data":"5b13b672e6e00e92f256e5125759e26cc636b93b2b1c7ddba5a5960ceca650a0"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.132217 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" event={"ID":"d15f89df-5eaf-48c8-963d-bc3e1c79bd43","Type":"ContainerStarted","Data":"9d6d2db44cc9f435332a1537d8dcee4fb6d883a2f1ee4560335c6594d7b3e53f"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.132261 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" event={"ID":"d15f89df-5eaf-48c8-963d-bc3e1c79bd43","Type":"ContainerStarted","Data":"fa86f84d9b3c927c534e66a8a8c7f06efa756a93210c273eee1c1f9021a4f2c6"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.133612 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" event={"ID":"1b1913d4-85d3-4596-acea-6e272cf81e8e","Type":"ContainerStarted","Data":"dd92dbba7d84f967b1d4396b47a319688a637672b5b28895be564fb0425a3769"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.134443 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" event={"ID":"5f00b345-a265-41cc-89b7-6f059fc4d5d1","Type":"ContainerStarted","Data":"34bc710b070e56fbe324763a2cc565269ad14a49a6e9ee5df8ba450137c73325"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.134465 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" event={"ID":"5f00b345-a265-41cc-89b7-6f059fc4d5d1","Type":"ContainerStarted","Data":"8cc8f299454a551b55d13491c775c0a939240c5e1e34939961d1c7c555a7aeb8"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.135334 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" event={"ID":"5526b957-e33f-4952-8dda-d2875c94686a","Type":"ContainerStarted","Data":"a815de6019a698c20bed21b73fbb7569a29188f85fc57d492c2463fb16dd7366"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.136132 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-frmbv" event={"ID":"155619c1-12ba-4149-9dce-474e3735168c","Type":"ContainerStarted","Data":"d68e68a5c791b5025cc874dfe67df8e2c5a13f25ec30257d931f822404c8bd9c"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.136154 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-frmbv" event={"ID":"155619c1-12ba-4149-9dce-474e3735168c","Type":"ContainerStarted","Data":"17b20743ee4b09534e8c553848de6d4017caec94554d3343f6ae0a7bdee43964"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.141953 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" event={"ID":"3d469ce1-e7ed-4826-a378-0de16f2b4e56","Type":"ContainerStarted","Data":"a64135342bdd7e818a09fa15f182506bb1c697bcadc4e9fc443eca02139d19bf"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.143779 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" event={"ID":"839b6744-bbe6-4b56-b020-181d86c604fe","Type":"ContainerStarted","Data":"27c89b131f2ada186ac01ff879401d8c7fcb538c8065adb433bed741f2f008f0"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.143803 4791 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" event={"ID":"839b6744-bbe6-4b56-b020-181d86c604fe","Type":"ContainerStarted","Data":"482bbf8090489513d1c9878cc86b880c98f076c8e494308a40bd3a03e946699f"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.143813 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" event={"ID":"839b6744-bbe6-4b56-b020-181d86c604fe","Type":"ContainerStarted","Data":"c23842d5f2e3eef9553c2a7828b456f5c3333453e6613522a68d073ac3d56526"} Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.194516 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.195138 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.695091386 +0000 UTC m=+143.174603913 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:05 crc kubenswrapper[4791]: W0217 00:08:05.221554 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod643578b4_75ca_4765_8df5_9167688e3ced.slice/crio-7bb16cbc0b60bc99bce5603cc5d99fefa64bde3c544cd8cd956d59cf9ea83790 WatchSource:0}: Error finding container 7bb16cbc0b60bc99bce5603cc5d99fefa64bde3c544cd8cd956d59cf9ea83790: Status 404 returned error can't find the container with id 7bb16cbc0b60bc99bce5603cc5d99fefa64bde3c544cd8cd956d59cf9ea83790 Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.254168 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b"] Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.295059 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.297397 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.298569 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.798553282 +0000 UTC m=+143.278065809 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.399311 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.400023 4791 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:05.900007524 +0000 UTC m=+143.379520051 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.500860 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.501465 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.001444995 +0000 UTC m=+143.480957522 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.592384 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r"] Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.597066 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28"] Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.602247 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.602821 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.102808244 +0000 UTC m=+143.582320761 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.608820 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99"] Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.616245 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7pqm8"] Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.626018 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s"] Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.627726 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5jsj6"] Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.698182 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf"] Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.702990 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.703253 4791 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.203227393 +0000 UTC m=+143.682739920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.703396 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.703925 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.203907195 +0000 UTC m=+143.683419722 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.705091 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7"] Feb 17 00:08:05 crc kubenswrapper[4791]: W0217 00:08:05.713663 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b747aa6_3874_4f71_86bb_d340398d7bc4.slice/crio-bf6c044a923c9a078ef93cd0ae0652e77b223f659eca082644bf22f9fba35dff WatchSource:0}: Error finding container bf6c044a923c9a078ef93cd0ae0652e77b223f659eca082644bf22f9fba35dff: Status 404 returned error can't find the container with id bf6c044a923c9a078ef93cd0ae0652e77b223f659eca082644bf22f9fba35dff Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.715249 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm"] Feb 17 00:08:05 crc kubenswrapper[4791]: W0217 00:08:05.799383 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71967495_8841_4810_89e5_e114b9887c5e.slice/crio-87a25ad62b20ec9733ec4fbfd91d80ed15aa968b40372fc7f787917720daec60 WatchSource:0}: Error finding container 87a25ad62b20ec9733ec4fbfd91d80ed15aa968b40372fc7f787917720daec60: Status 404 returned error can't find the container with id 87a25ad62b20ec9733ec4fbfd91d80ed15aa968b40372fc7f787917720daec60 Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.804548 4791 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.805294 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.305277155 +0000 UTC m=+143.784789682 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.866769 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-49n75" podStartSLOduration=123.866751731 podStartE2EDuration="2m3.866751731s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:05.832686876 +0000 UTC m=+143.312199413" watchObservedRunningTime="2026-02-17 00:08:05.866751731 +0000 UTC m=+143.346264258" Feb 17 00:08:05 crc kubenswrapper[4791]: I0217 00:08:05.906166 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:05 crc kubenswrapper[4791]: E0217 00:08:05.906599 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.406588112 +0000 UTC m=+143.886100639 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.006886 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.006984 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.50696922 +0000 UTC m=+143.986481747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.007239 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.008030 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.508020853 +0000 UTC m=+143.987533370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.018605 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.024091 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.035301 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.088853 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-72t6m"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.112240 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.112530 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 00:08:06.612517813 +0000 UTC m=+144.092030340 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.121719 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv"] Feb 17 00:08:06 crc kubenswrapper[4791]: W0217 00:08:06.127890 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb516ca_988f_4b77_aea0_10cd22ce2b77.slice/crio-c1926e7439e68bf34f98580c0dfdb9a54edefab6aa0f87ce086fac43ffb8969c WatchSource:0}: Error finding container c1926e7439e68bf34f98580c0dfdb9a54edefab6aa0f87ce086fac43ffb8969c: Status 404 returned error can't find the container with id c1926e7439e68bf34f98580c0dfdb9a54edefab6aa0f87ce086fac43ffb8969c Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.136018 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29521440-k6f7k" podStartSLOduration=124.135995498 podStartE2EDuration="2m4.135995498s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:06.124599902 +0000 UTC m=+143.604112429" watchObservedRunningTime="2026-02-17 00:08:06.135995498 +0000 UTC m=+143.615508015" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.139027 4791 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.190520 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-946wq"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.194097 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.196444 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4chtt"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.202591 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bfffb"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.213369 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.213755 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.713739398 +0000 UTC m=+144.193251925 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.240475 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tlpgd"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.241409 4791 generic.go:334] "Generic (PLEG): container finished" podID="1b1913d4-85d3-4596-acea-6e272cf81e8e" containerID="40fb24938620be8b1d416e78d154ed948754865f026cb8ac47e2f7cdff5ab937" exitCode=0 Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.241471 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" event={"ID":"1b1913d4-85d3-4596-acea-6e272cf81e8e","Type":"ContainerDied","Data":"40fb24938620be8b1d416e78d154ed948754865f026cb8ac47e2f7cdff5ab937"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.252769 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" event={"ID":"5e2cf0a7-86f6-4858-98ad-08c4c644deb9","Type":"ContainerStarted","Data":"95328a6e95fda24b2e97b8fc71783b3f7f2bbed00ffdfd64eb7ca9514a883552"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.253150 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" event={"ID":"5e2cf0a7-86f6-4858-98ad-08c4c644deb9","Type":"ContainerStarted","Data":"4a8e89134aa3ad074635175763043d317ffc6ed9e2fc49c3f8ca922783b89cf5"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.258743 4791 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" event={"ID":"9578978b-522d-48d8-9b08-384752fc49a1","Type":"ContainerStarted","Data":"7f331e409b51489b981cf18efb64a20186608a5eff470f354f95c51f4220dbe4"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.261899 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" event={"ID":"0b747aa6-3874-4f71-86bb-d340398d7bc4","Type":"ContainerStarted","Data":"bf6c044a923c9a078ef93cd0ae0652e77b223f659eca082644bf22f9fba35dff"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.274699 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" event={"ID":"fe44c059-87ef-4805-b78f-b8c3cdfd844e","Type":"ContainerStarted","Data":"5834b63100c93cd9f5d7ab15fdceb720d8fe47ee8d67789995b9fafee6f966ba"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.278493 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rt865" event={"ID":"0522c983-dae6-41ca-807a-ff45912a0024","Type":"ContainerStarted","Data":"75e7a03529d69627bae6599e10cd571ed8062d4194733ab6148d56f13ab20292"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.278525 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rt865" event={"ID":"0522c983-dae6-41ca-807a-ff45912a0024","Type":"ContainerStarted","Data":"a66362d342c250fa63e0df4225566f96b7e347652c5e248e46f436271cd8a461"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.278765 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-rt865" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.279968 4791 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt865 container/download-server namespace/openshift-console: Readiness probe status=failure 
output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.280045 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rt865" podUID="0522c983-dae6-41ca-807a-ff45912a0024" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.286794 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" event={"ID":"6a866a69-9159-4dd1-a03d-b2a0f703fb7b","Type":"ContainerStarted","Data":"d077feb7a29e2e612d7324e3f4804db6415f0167d613f565b60c6d0f2bcce8f4"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.286837 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" event={"ID":"6a866a69-9159-4dd1-a03d-b2a0f703fb7b","Type":"ContainerStarted","Data":"0c85ff67a65174e4212f77cdeae113e56a44995c48f0f9d56c7ed9adda3bd480"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.287793 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.289520 4791 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ht455 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.289565 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" 
podUID="6a866a69-9159-4dd1-a03d-b2a0f703fb7b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.292966 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.293526 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" event={"ID":"5f00b345-a265-41cc-89b7-6f059fc4d5d1","Type":"ContainerStarted","Data":"b7a8b18536fa6767c60f309e08d47207638d465f657bf2bdbc7d3f33f8420cdd"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.295952 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" event={"ID":"9c752f56-7754-4718-aea5-cb41d6ac4253","Type":"ContainerStarted","Data":"ce9274f47cae1c20fec77be0219875543af87f8c562bbea7299d31cef267add7"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.295992 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" event={"ID":"9c752f56-7754-4718-aea5-cb41d6ac4253","Type":"ContainerStarted","Data":"959c67ba56cd2f54f4dff7177cf5df6ac41f5fff7c2ee1fbec83484cb2387fd7"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.297179 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" event={"ID":"e3032312-913c-4072-ac18-56fdc689cbac","Type":"ContainerStarted","Data":"6b5d87aecd7ce403be0e26e7def9943ae61b90ac529666b24e43dee5fa092938"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.297989 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" event={"ID":"8543e6a7-7bb0-4a35-96c5-bcae0763cc78","Type":"ContainerStarted","Data":"2477c6f12128d4912d9d839f8fff599ccaef2a4115288be43d6389a6b9f92bb1"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.299318 4791 generic.go:334] "Generic (PLEG): container finished" podID="643578b4-75ca-4765-8df5-9167688e3ced" containerID="caa80972cddb40983cf80f6f3371bdce39531bf5d3a8e10cac13ecf851412e29" exitCode=0 Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.299363 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" event={"ID":"643578b4-75ca-4765-8df5-9167688e3ced","Type":"ContainerDied","Data":"caa80972cddb40983cf80f6f3371bdce39531bf5d3a8e10cac13ecf851412e29"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.300305 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" event={"ID":"643578b4-75ca-4765-8df5-9167688e3ced","Type":"ContainerStarted","Data":"7bb16cbc0b60bc99bce5603cc5d99fefa64bde3c544cd8cd956d59cf9ea83790"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.301221 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" event={"ID":"03d7a8df-a8a3-4b34-bd28-d554ae70875a","Type":"ContainerStarted","Data":"5b641221af4812862b561709d03b916acf859293fea57d3885afe67acc85e731"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.303270 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-t5827" event={"ID":"4360bf41-9e45-498e-8f94-2c43a0dc88e5","Type":"ContainerStarted","Data":"b64b8ec32795b5b45afa67cf013bcbd40ce1d7f9297fbba36a4eaaef9064bd6d"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.304224 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.306309 4791 patch_prober.go:28] interesting pod/console-operator-58897d9998-t5827 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.306347 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-t5827" podUID="4360bf41-9e45-498e-8f94-2c43a0dc88e5" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.314844 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.315276 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.815254732 +0000 UTC m=+144.294767259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.316491 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" event={"ID":"3d469ce1-e7ed-4826-a378-0de16f2b4e56","Type":"ContainerStarted","Data":"98ec5fd16ca8e4ce986234c99077ede912bf552eaab0db59db3d4daa61506b98"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.321750 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" event={"ID":"b7f0fa93-740f-43aa-9350-24d9920a9345","Type":"ContainerStarted","Data":"17ee9158e3fdfd68777a3f05d407f424eb25dc139182fc00aacd2bde56bdfe08"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.321784 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" event={"ID":"b7f0fa93-740f-43aa-9350-24d9920a9345","Type":"ContainerStarted","Data":"ebd3e1d0b20d4e9d74101c27aed6853f5a3c379ff19f48e52abf9f9e0ca45145"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.324532 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d7qrh" podStartSLOduration=124.324518499 podStartE2EDuration="2m4.324518499s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:06.309415934 +0000 UTC 
m=+143.788928461" watchObservedRunningTime="2026-02-17 00:08:06.324518499 +0000 UTC m=+143.804031026" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.325925 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7"] Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.326234 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm" event={"ID":"c5fb65f7-7cc6-4834-853e-a91eebc956fd","Type":"ContainerStarted","Data":"4db7223c621dad4ca4a2cf4fdc79db21ea445ff73894457d60af10a67d532fa1"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.334216 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" event={"ID":"c70fe9d3-348d-4bb8-89f7-21027041131a","Type":"ContainerStarted","Data":"8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.334263 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" event={"ID":"c70fe9d3-348d-4bb8-89f7-21027041131a","Type":"ContainerStarted","Data":"bc92c2848641b389e30636fc84dbaa434604ffad03bf817e464c4e944f11c4ea"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.334386 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.336483 4791 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-r8zpf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" start-of-body= Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.336515 4791 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" podUID="c70fe9d3-348d-4bb8-89f7-21027041131a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.337512 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" event={"ID":"5526b957-e33f-4952-8dda-d2875c94686a","Type":"ContainerStarted","Data":"db531583ab8b781dd61dfe3d2878986ca0106d031f65f34ed6d98ea59985f958"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.343091 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5bsn7" event={"ID":"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd","Type":"ContainerStarted","Data":"4f48b0b1fc00d23e7619acb357b92513cac2e30b3d59e85597d23f892fcaa983"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.343183 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5bsn7" event={"ID":"d2cd3da0-c3e4-461c-93ed-064c8b3b0edd","Type":"ContainerStarted","Data":"1658775be87da06af96b858aee3c59d0f774214e3ca4092a0f764a8ff07f1081"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.347965 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" event={"ID":"ad62ba3d-c60a-4e1f-9768-187e74151f24","Type":"ContainerStarted","Data":"5bbd0db20373a7441f5bb6d421e3277a5d8b5d7d851ff90580117407daeb995a"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.348003 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" event={"ID":"ad62ba3d-c60a-4e1f-9768-187e74151f24","Type":"ContainerStarted","Data":"426d716b18b9a71736268368767e3928361e1d1bb985b12b162b76f2351d832b"} Feb 17 00:08:06 crc 
kubenswrapper[4791]: I0217 00:08:06.353982 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kt8q6" event={"ID":"459f3992-b770-44d7-9ecc-0ae8a228134f","Type":"ContainerStarted","Data":"b7a95c7efddf0dd9f9591d69cbb3badc012e04c94fd97824a044926e6b6f6c9a"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.362194 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" event={"ID":"713c3460-f77d-4f7b-81bf-911f8f875dfe","Type":"ContainerStarted","Data":"eccdaaec958face5d5dcfd6c5fddbc0b4b13a69d1c98503b7abb4bafa6bcd4bc"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.362888 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:06 crc kubenswrapper[4791]: W0217 00:08:06.364000 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae8af772_70a9_4758_b597_363c1db463ad.slice/crio-1f80b66be0bf331fa0415fc5c02f9f102805b77c52a528b4f2d720eb3e6a3d92 WatchSource:0}: Error finding container 1f80b66be0bf331fa0415fc5c02f9f102805b77c52a528b4f2d720eb3e6a3d92: Status 404 returned error can't find the container with id 1f80b66be0bf331fa0415fc5c02f9f102805b77c52a528b4f2d720eb3e6a3d92 Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.368163 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" event={"ID":"b6c19ecc-0208-46de-8c03-6780bba30353","Type":"ContainerStarted","Data":"69b342b3e6d027d571b603d15930a15b414ae48ff61d21acef0575eabb9dfc2d"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.369856 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" 
event={"ID":"afb516ca-988f-4b77-aea0-10cd22ce2b77","Type":"ContainerStarted","Data":"c1926e7439e68bf34f98580c0dfdb9a54edefab6aa0f87ce086fac43ffb8969c"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.370738 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" event={"ID":"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4","Type":"ContainerStarted","Data":"5fbb286dd9c23a26eccebb86c4ca61f9def0db6f356bb9410711a9d2ce0c4e86"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.372029 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" event={"ID":"71967495-8841-4810-89e5-e114b9887c5e","Type":"ContainerStarted","Data":"87a25ad62b20ec9733ec4fbfd91d80ed15aa968b40372fc7f787917720daec60"} Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.375300 4791 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-jftdn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.375372 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" podUID="713c3460-f77d-4f7b-81bf-911f8f875dfe" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.416958 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.421369 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:06.921354173 +0000 UTC m=+144.400866700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.518182 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.518458 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.018422034 +0000 UTC m=+144.497934571 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.518795 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.519541 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.01953021 +0000 UTC m=+144.499042737 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.620694 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.620998 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.120985561 +0000 UTC m=+144.600498088 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.678585 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.704962 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.705010 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.710745 4791 csr.go:261] certificate signing request csr-r7wvn is approved, waiting to be issued Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.718738 4791 csr.go:257] certificate signing request csr-r7wvn is issued Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.721704 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-frmbv" podStartSLOduration=124.721683349 podStartE2EDuration="2m4.721683349s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:06.71796447 +0000 UTC m=+144.197476997" watchObservedRunningTime="2026-02-17 00:08:06.721683349 +0000 UTC m=+144.201195876" Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.722295 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.722617 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.222607199 +0000 UTC m=+144.702119726 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.823279 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.826251 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.32623169 +0000 UTC m=+144.805744217 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:06 crc kubenswrapper[4791]: I0217 00:08:06.927263 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:06 crc kubenswrapper[4791]: E0217 00:08:06.927581 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.427569729 +0000 UTC m=+144.907082256 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:06.999227 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" podStartSLOduration=124.999207971 podStartE2EDuration="2m4.999207971s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:06.957604184 +0000 UTC m=+144.437116711" watchObservedRunningTime="2026-02-17 00:08:06.999207971 +0000 UTC m=+144.478720498" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.028899 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.029401 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.529382361 +0000 UTC m=+145.008894898 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.032722 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" podStartSLOduration=124.032703718 podStartE2EDuration="2m4.032703718s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:06.999722758 +0000 UTC m=+144.479235285" watchObservedRunningTime="2026-02-17 00:08:07.032703718 +0000 UTC m=+144.512216255" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.082408 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-stqb9" podStartSLOduration=125.082374016 podStartE2EDuration="2m5.082374016s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.037039078 +0000 UTC m=+144.516551605" watchObservedRunningTime="2026-02-17 00:08:07.082374016 +0000 UTC m=+144.561886543" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.121023 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8lhkg" podStartSLOduration=125.120995047 podStartE2EDuration="2m5.120995047s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.120380107 +0000 UTC m=+144.599892644" watchObservedRunningTime="2026-02-17 00:08:07.120995047 +0000 UTC m=+144.600507574" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.122614 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" podStartSLOduration=125.122605858 podStartE2EDuration="2m5.122605858s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.084334189 +0000 UTC m=+144.563846716" watchObservedRunningTime="2026-02-17 00:08:07.122605858 +0000 UTC m=+144.602118385" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.133533 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.134222 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.634207711 +0000 UTC m=+145.113720238 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.191823 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qzgkr" podStartSLOduration=125.191799243 podStartE2EDuration="2m5.191799243s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.19136697 +0000 UTC m=+144.670879497" watchObservedRunningTime="2026-02-17 00:08:07.191799243 +0000 UTC m=+144.671311770" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.197919 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-t5827" podStartSLOduration=125.197900449 podStartE2EDuration="2m5.197900449s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.164458734 +0000 UTC m=+144.643971271" watchObservedRunningTime="2026-02-17 00:08:07.197900449 +0000 UTC m=+144.677412976" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.239661 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.240615 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.739800876 +0000 UTC m=+145.219313403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.240906 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.241194 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.741186721 +0000 UTC m=+145.220699248 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.281029 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fpr65" podStartSLOduration=125.281010352 podStartE2EDuration="2m5.281010352s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.280433133 +0000 UTC m=+144.759945660" watchObservedRunningTime="2026-02-17 00:08:07.281010352 +0000 UTC m=+144.760522879" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.341552 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.342089 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.842075895 +0000 UTC m=+145.321588422 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.367496 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-kt8q6" podStartSLOduration=124.367475351 podStartE2EDuration="2m4.367475351s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.320711117 +0000 UTC m=+144.800223644" watchObservedRunningTime="2026-02-17 00:08:07.367475351 +0000 UTC m=+144.846987878" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.390807 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fpv5b" podStartSLOduration=124.39078679 podStartE2EDuration="2m4.39078679s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.388997223 +0000 UTC m=+144.868509750" watchObservedRunningTime="2026-02-17 00:08:07.39078679 +0000 UTC m=+144.870299317" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.442545 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" 
event={"ID":"57ed01e7-bfeb-428e-88c2-371662581ddf","Type":"ContainerStarted","Data":"47db9cbaac3ec07815f8620ce83195c7b16b1455be8613b9e97946b41fce37d2"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.442602 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" event={"ID":"57ed01e7-bfeb-428e-88c2-371662581ddf","Type":"ContainerStarted","Data":"2c1890499027e2938c74ddde9e2725343a834c43969364e8ddfd9209318e0174"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.445182 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.447410 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:07.947392781 +0000 UTC m=+145.426905318 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.448561 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" event={"ID":"13a5be44-f180-42a9-bff7-8ba69cc589f0","Type":"ContainerStarted","Data":"1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.448598 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" event={"ID":"13a5be44-f180-42a9-bff7-8ba69cc589f0","Type":"ContainerStarted","Data":"f1a3439d45cbb877a9cdb806affb8d5e0982a3ff436258b9fc60b97b89a3ef01"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.449253 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.455661 4791 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bfffb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.455718 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.457566 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm" event={"ID":"c5fb65f7-7cc6-4834-853e-a91eebc956fd","Type":"ContainerStarted","Data":"81b427b7573e4cf9cfbb7bccb1f44c300b9035efb6402caed9a3e5c28a75be2c"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.457692 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm" event={"ID":"c5fb65f7-7cc6-4834-853e-a91eebc956fd","Type":"ContainerStarted","Data":"b5070e857243a46a8b6ec53eb4bd092d26e78b3683e50f8c6f17a30b85de8f06"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.464677 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-rt865" podStartSLOduration=125.464660796 podStartE2EDuration="2m5.464660796s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.436080867 +0000 UTC m=+144.915593394" watchObservedRunningTime="2026-02-17 00:08:07.464660796 +0000 UTC m=+144.944173323" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.465370 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-5bsn7" podStartSLOduration=6.465362809 podStartE2EDuration="6.465362809s" podCreationTimestamp="2026-02-17 00:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.462973022 +0000 UTC m=+144.942485549" watchObservedRunningTime="2026-02-17 00:08:07.465362809 +0000 UTC m=+144.944875336" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 
00:08:07.468966 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" event={"ID":"afb516ca-988f-4b77-aea0-10cd22ce2b77","Type":"ContainerStarted","Data":"4cb4c39123a8be97c43cf0e8daee53a169b6ed27a15a587373572b7e66c5f434"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.478246 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" event={"ID":"b6c19ecc-0208-46de-8c03-6780bba30353","Type":"ContainerStarted","Data":"88b8363acadb5b09be62aa6d3adc4a7a5a77f1c9201ddd017a5c9eafc8b56850"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.478786 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.480919 4791 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-bt9mf container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" start-of-body= Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.480955 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" podUID="b6c19ecc-0208-46de-8c03-6780bba30353" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.481829 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" event={"ID":"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9","Type":"ContainerStarted","Data":"adccb50d3753547360cab0c817bc0d7632a6d724c3247fc54920b604edd01bb6"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.481859 4791 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" event={"ID":"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9","Type":"ContainerStarted","Data":"91b7f18e5cbfceaf615e5d9ab87af73003d95c6a41afd231c4fdf2400c98cb75"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.486971 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" event={"ID":"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4","Type":"ContainerStarted","Data":"8947c56ad1abf6d6fce581ea563cbdab0f0532d6619fdfa98830e8a9e3f4a72c"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.493365 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" event={"ID":"71967495-8841-4810-89e5-e114b9887c5e","Type":"ContainerStarted","Data":"8e11fc249b980b74afafbdb667e9dd6da9a26fff5a79d4df56b4c2561adf95c2"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.493419 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" event={"ID":"71967495-8841-4810-89e5-e114b9887c5e","Type":"ContainerStarted","Data":"4a592c72a5c70f3b607d8355518fa2a61e41d24c5046c8917097386958e68595"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.494209 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.497552 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" event={"ID":"0b747aa6-3874-4f71-86bb-d340398d7bc4","Type":"ContainerStarted","Data":"2c221fab134d5bb1527c120a90943844b5e7c53ff11f82640c354dffd9e48720"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.500543 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" event={"ID":"9c752f56-7754-4718-aea5-cb41d6ac4253","Type":"ContainerStarted","Data":"fd12d2709dab0233fec62f39ff499f32b9c663664b15a82880186278c4b6e39c"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.502995 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" event={"ID":"7bf3a6ce-9a2b-49cc-9360-f552988f2b38","Type":"ContainerStarted","Data":"7dc5812277568e4be7c6d93c6c7690e193a15bbfb1d865aabdbab9ae66fc01f9"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.503022 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" event={"ID":"7bf3a6ce-9a2b-49cc-9360-f552988f2b38","Type":"ContainerStarted","Data":"2154b5e50cacaf67ad92bfbf591488e9a51c3d8f281cddf4f2860670bbd04712"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.504319 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.505704 4791 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dhmqq container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.505757 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" podUID="7bf3a6ce-9a2b-49cc-9360-f552988f2b38" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.521472 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nlbr" podStartSLOduration=125.521458552 podStartE2EDuration="2m5.521458552s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.520967076 +0000 UTC m=+145.000479603" watchObservedRunningTime="2026-02-17 00:08:07.521458552 +0000 UTC m=+145.000971079" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.550047 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.551480 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.051449816 +0000 UTC m=+145.530962423 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.562642 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" event={"ID":"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c","Type":"ContainerStarted","Data":"891df08438b436b0b07d8427461e11fc0c9eb7e638150952cf235c2137b716b6"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.563006 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" event={"ID":"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c","Type":"ContainerStarted","Data":"a39ba9373a1c8330d92d717ecb292520ee487332b22cd14b0e5fe57ebfb54ebe"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.564439 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" podStartSLOduration=124.564417943 podStartE2EDuration="2m4.564417943s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.563897827 +0000 UTC m=+145.043410344" watchObservedRunningTime="2026-02-17 00:08:07.564417943 +0000 UTC m=+145.043930470" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.580406 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" 
event={"ID":"9578978b-522d-48d8-9b08-384752fc49a1","Type":"ContainerStarted","Data":"73dce2445581148c9aaac0ae8e3c3141a42b2fd397bc6713d36dce2b9e9f8734"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.594209 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4chtt" event={"ID":"eeffaf81-97bf-4570-b2f4-4692c4bda9ac","Type":"ContainerStarted","Data":"465e4b0a5eabb9bcd00a642ab6edbddbf334af705cc21d7f8cc83097c19b70c9"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.597004 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-952sm" podStartSLOduration=124.59698501 podStartE2EDuration="2m4.59698501s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.593667484 +0000 UTC m=+145.073180011" watchObservedRunningTime="2026-02-17 00:08:07.59698501 +0000 UTC m=+145.076497537" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.617019 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" event={"ID":"8543e6a7-7bb0-4a35-96c5-bcae0763cc78","Type":"ContainerStarted","Data":"397616aa3db0a71bad5937164d93b4b861dc501caa61b23b59368d98023799d5"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.623695 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" event={"ID":"1b1913d4-85d3-4596-acea-6e272cf81e8e","Type":"ContainerStarted","Data":"d4e158158a9ecd782e740e08c84c8fed9941911050983e345a832d4aaf89547b"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.645675 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-5nwz7" podStartSLOduration=124.645654255 
podStartE2EDuration="2m4.645654255s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.642505983 +0000 UTC m=+145.122018520" watchObservedRunningTime="2026-02-17 00:08:07.645654255 +0000 UTC m=+145.125166782" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.652059 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.657823 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" event={"ID":"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23","Type":"ContainerStarted","Data":"b94bcc5ae36e521ca026383a0c5e72a801fc0341a5b3ffe868661eb976b4bc13"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.657868 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" event={"ID":"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23","Type":"ContainerStarted","Data":"19d706732e0f8e2555cd52821bb9ca106e51d445f5f15e844321a815eb6e1524"} Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.658542 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.158522529 +0000 UTC m=+145.638035056 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.690288 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:07 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:07 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:07 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.690340 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.711437 4791 generic.go:334] "Generic (PLEG): container finished" podID="ad62ba3d-c60a-4e1f-9768-187e74151f24" containerID="5bbd0db20373a7441f5bb6d421e3277a5d8b5d7d851ff90580117407daeb995a" exitCode=0 Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.711527 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" event={"ID":"ad62ba3d-c60a-4e1f-9768-187e74151f24","Type":"ContainerDied","Data":"5bbd0db20373a7441f5bb6d421e3277a5d8b5d7d851ff90580117407daeb995a"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.712108 4791 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.714190 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" podStartSLOduration=124.714178368 podStartE2EDuration="2m4.714178368s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.684937708 +0000 UTC m=+145.164450255" watchObservedRunningTime="2026-02-17 00:08:07.714178368 +0000 UTC m=+145.193690895" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.726061 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-17 00:03:06 +0000 UTC, rotation deadline is 2026-12-16 14:11:36.826202543 +0000 UTC Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.726100 4791 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7262h3m29.100104771s for next certificate rotation Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.757011 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.768559 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.268528355 +0000 UTC m=+145.748040882 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.782315 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nb5r" podStartSLOduration=124.782296259 podStartE2EDuration="2m4.782296259s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.781688229 +0000 UTC m=+145.261200756" watchObservedRunningTime="2026-02-17 00:08:07.782296259 +0000 UTC m=+145.261808786" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.784197 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5jsj6" podStartSLOduration=124.784186209 podStartE2EDuration="2m4.784186209s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.713189846 +0000 UTC m=+145.192702373" watchObservedRunningTime="2026-02-17 00:08:07.784186209 +0000 UTC m=+145.263698736" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.804061 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" event={"ID":"643578b4-75ca-4765-8df5-9167688e3ced","Type":"ContainerStarted","Data":"62f51fb9424beb412ed9ef02bcaf7076ed9292d3aa6ee47b9a982216b2afd220"} Feb 17 00:08:07 crc 
kubenswrapper[4791]: I0217 00:08:07.841273 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" podStartSLOduration=124.841255384 podStartE2EDuration="2m4.841255384s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.839058123 +0000 UTC m=+145.318570660" watchObservedRunningTime="2026-02-17 00:08:07.841255384 +0000 UTC m=+145.320767911" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.842707 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-946wq" event={"ID":"1ce1b285-b6aa-4361-aa9e-5274a9863b6a","Type":"ContainerStarted","Data":"907f113a9eb0cb9413583d679214fec1c783c5575065ad366e0da999ba69a20a"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.842861 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-946wq" event={"ID":"1ce1b285-b6aa-4361-aa9e-5274a9863b6a","Type":"ContainerStarted","Data":"4186455468f535264d84e387847aaf3597f1545f4f7b77b7e46747846bfb16be"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.860051 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.860432 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 00:08:08.360421371 +0000 UTC m=+145.839933898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.872160 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" event={"ID":"fe44c059-87ef-4805-b78f-b8c3cdfd844e","Type":"ContainerStarted","Data":"d862b93abaa637b5ab55ebf1f661d266f0d2f4eb72a89f6d98beacebc81b9836"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.885343 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" event={"ID":"e3032312-913c-4072-ac18-56fdc689cbac","Type":"ContainerStarted","Data":"75ef4aa7afdc141b6e36cf7b02be25daeea88d141e2cdf4b633c01c8c4a99163"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.896017 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" event={"ID":"811e9e22-1241-440a-9a6a-a6c51a0f0f7c","Type":"ContainerStarted","Data":"8e87bff544ec51acf1da27e383e4d3a8706325f1b9fe8906f2ac798c476888bc"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.904188 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" podStartSLOduration=124.904173007 podStartE2EDuration="2m4.904173007s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.90085212 +0000 UTC m=+145.380364647" watchObservedRunningTime="2026-02-17 00:08:07.904173007 +0000 UTC m=+145.383685534" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.905794 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" podStartSLOduration=125.905784988 podStartE2EDuration="2m5.905784988s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.883312596 +0000 UTC m=+145.362825123" watchObservedRunningTime="2026-02-17 00:08:07.905784988 +0000 UTC m=+145.385297515" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.910642 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" event={"ID":"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101","Type":"ContainerStarted","Data":"7417c148613956dc9e2a33a3c1c857f51db890058385f1ed9572da09e498bcfe"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.910745 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" event={"ID":"1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101","Type":"ContainerStarted","Data":"cc582c53139208e9ff16dceeec5c6852aa1e467a790f13ee77acbbaad464c1e0"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.911446 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.916197 4791 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-hb9dg container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.916252 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" podUID="1ef4b62b-1af2-4d7b-90cf-3ac35a4bf101" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.935263 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" event={"ID":"03d7a8df-a8a3-4b34-bd28-d554ae70875a","Type":"ContainerStarted","Data":"87d0f65c46e40821190dfe62fb590339519d2031f9ca103c8cef0bd4136479e1"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.960741 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:07 crc kubenswrapper[4791]: E0217 00:08:07.962394 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.462379318 +0000 UTC m=+145.941891835 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.964355 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-946wq" podStartSLOduration=6.964339011 podStartE2EDuration="6.964339011s" podCreationTimestamp="2026-02-17 00:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.927342702 +0000 UTC m=+145.406855229" watchObservedRunningTime="2026-02-17 00:08:07.964339011 +0000 UTC m=+145.443851538" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.965917 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" podStartSLOduration=124.965910252 podStartE2EDuration="2m4.965910252s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:07.962913096 +0000 UTC m=+145.442425613" watchObservedRunningTime="2026-02-17 00:08:07.965910252 +0000 UTC m=+145.445422779" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.976282 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" event={"ID":"ae8af772-70a9-4758-b597-363c1db463ad","Type":"ContainerStarted","Data":"b5e8825abe9a3565f618a3e4c2364368a7b0c7552d26b7768a158f2c23435e88"} Feb 17 00:08:07 crc 
kubenswrapper[4791]: I0217 00:08:07.976317 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" event={"ID":"ae8af772-70a9-4758-b597-363c1db463ad","Type":"ContainerStarted","Data":"1f80b66be0bf331fa0415fc5c02f9f102805b77c52a528b4f2d720eb3e6a3d92"} Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.978859 4791 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt865 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.978920 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rt865" podUID="0522c983-dae6-41ca-807a-ff45912a0024" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.979366 4791 patch_prober.go:28] interesting pod/console-operator-58897d9998-t5827 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.979402 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-t5827" podUID="4360bf41-9e45-498e-8f94-2c43a0dc88e5" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 17 00:08:07 crc kubenswrapper[4791]: I0217 00:08:07.992419 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:08:08 crc 
kubenswrapper[4791]: I0217 00:08:08.033655 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" podStartSLOduration=126.03363591 podStartE2EDuration="2m6.03363591s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.001165195 +0000 UTC m=+145.480677742" watchObservedRunningTime="2026-02-17 00:08:08.03363591 +0000 UTC m=+145.513148437" Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.035328 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57qch" podStartSLOduration=125.035323324 podStartE2EDuration="2m5.035323324s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.031837651 +0000 UTC m=+145.511350178" watchObservedRunningTime="2026-02-17 00:08:08.035323324 +0000 UTC m=+145.514835851" Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.064376 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.075511 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 00:08:08.575492565 +0000 UTC m=+146.055005192 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.087909 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fs52s" podStartSLOduration=125.087890773 podStartE2EDuration="2m5.087890773s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.087515591 +0000 UTC m=+145.567028118" watchObservedRunningTime="2026-02-17 00:08:08.087890773 +0000 UTC m=+145.567403290" Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.118501 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.139805 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" podStartSLOduration=125.139791593 podStartE2EDuration="2m5.139791593s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.137262242 +0000 UTC m=+145.616774769" watchObservedRunningTime="2026-02-17 00:08:08.139791593 +0000 UTC 
m=+145.619304120" Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.170553 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.171039 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.671025057 +0000 UTC m=+146.150537584 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.221769 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kqr99" podStartSLOduration=125.221749498 podStartE2EDuration="2m5.221749498s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.201065323 +0000 UTC m=+145.680577850" watchObservedRunningTime="2026-02-17 00:08:08.221749498 +0000 UTC m=+145.701262025" Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.222945 4791 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" podStartSLOduration=125.222939226 podStartE2EDuration="2m5.222939226s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.160277521 +0000 UTC m=+145.639790048" watchObservedRunningTime="2026-02-17 00:08:08.222939226 +0000 UTC m=+145.702451753" Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.272793 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.273096 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.773083898 +0000 UTC m=+146.252596415 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.348761 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" podStartSLOduration=125.348740981 podStartE2EDuration="2m5.348740981s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.347354727 +0000 UTC m=+145.826867244" watchObservedRunningTime="2026-02-17 00:08:08.348740981 +0000 UTC m=+145.828253508" Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.375624 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.375776 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.87575608 +0000 UTC m=+146.355268607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.375990 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.376158 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xc8t7" podStartSLOduration=125.376137171 podStartE2EDuration="2m5.376137171s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:08.373099663 +0000 UTC m=+145.852612190" watchObservedRunningTime="2026-02-17 00:08:08.376137171 +0000 UTC m=+145.855649698" Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.376310 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.876301876 +0000 UTC m=+146.355814403 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.478577 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.478953 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:08.978937067 +0000 UTC m=+146.458449594 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.579787 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.580110 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.08008794 +0000 UTC m=+146.559600457 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.677663 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:08 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:08 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:08 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.677720 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.681080 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.681216 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 00:08:09.18119715 +0000 UTC m=+146.660709677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.681410 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.681752 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.181741827 +0000 UTC m=+146.661254354 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.757133 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.782226 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.782604 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.28259028 +0000 UTC m=+146.762102807 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.884328 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.887491 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.387477553 +0000 UTC m=+146.866990080 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.989032 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.989846 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.489569225 +0000 UTC m=+146.969081752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:08 crc kubenswrapper[4791]: I0217 00:08:08.990245 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:08 crc kubenswrapper[4791]: E0217 00:08:08.990585 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.490574107 +0000 UTC m=+146.970086634 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.004543 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" event={"ID":"1b1913d4-85d3-4596-acea-6e272cf81e8e","Type":"ContainerStarted","Data":"beb21327d747d81874b1f2bc384ac019160f075efa5e3693137edea5df10aad0"} Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.013712 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" event={"ID":"afb516ca-988f-4b77-aea0-10cd22ce2b77","Type":"ContainerStarted","Data":"95225305956cc307b1fd95289e86fe96f4a3d50e669118b9c57071737580a1fe"} Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.017966 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n2s28" event={"ID":"fe44c059-87ef-4805-b78f-b8c3cdfd844e","Type":"ContainerStarted","Data":"ad14918ea54f5f65c676b3347afb288620684183969cb994bb4b1b542057f1a3"} Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.021127 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" event={"ID":"811e9e22-1241-440a-9a6a-a6c51a0f0f7c","Type":"ContainerStarted","Data":"b062dbdf343b3684687d313c0a8e3c0a2c50ee9846d661195465b4147adac306"} Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.023848 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" 
event={"ID":"ad62ba3d-c60a-4e1f-9768-187e74151f24","Type":"ContainerStarted","Data":"6581a449ea38bab0e4535ff76ee15c3ea69bca38a74bdbd7c31b9de1ff9d756e"} Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.025594 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" event={"ID":"ee8894f1-34ac-4df2-bf1a-01e1a110d6c9","Type":"ContainerStarted","Data":"c1fc365db21355adc10ca4434fc6bfa3c4562b2be333b5195506d687a0ab5b98"} Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.029220 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" event={"ID":"c50f96b4-3a86-4edc-b9d5-82fe3181b8a4","Type":"ContainerStarted","Data":"ce2dff7d47e9dd38a5436e594237da01da95c1f17a299dc45787ca15d700b4d2"} Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.031656 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vb8fv" event={"ID":"3baa46c8-1f0b-4b6f-95bb-94368bf6cc23","Type":"ContainerStarted","Data":"64685b3eaf469ada0cd4ed21bf6b0dd74fa2905ced4ac43eccb0f2d80dce5b32"} Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.034956 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4chtt" event={"ID":"eeffaf81-97bf-4570-b2f4-4692c4bda9ac","Type":"ContainerStarted","Data":"c70e43d973b6126077673278b53e1b32835f0991ddb2d79c1c137fe06938c45b"} Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.035082 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4chtt" event={"ID":"eeffaf81-97bf-4570-b2f4-4692c4bda9ac","Type":"ContainerStarted","Data":"da29cdcbc1a24aabb2bd2c21956f85a3dc79ca23b54b9abc3e41aeff45f5b102"} Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.037499 4791 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bfffb container/marketplace-operator namespace/openshift-marketplace: 
Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.037538 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.056946 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhmqq" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.059530 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.059684 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.060261 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hb9dg" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.061090 4791 patch_prober.go:28] interesting pod/apiserver-76f77b778f-flvjk container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.061157 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" podUID="1b1913d4-85d3-4596-acea-6e272cf81e8e" containerName="openshift-apiserver" probeResult="failure" output="Get 
\"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.091651 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.091826 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.591799122 +0000 UTC m=+147.071311649 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.093618 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.095298 4791 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.595286464 +0000 UTC m=+147.074798991 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.099540 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" podStartSLOduration=127.09952487 podStartE2EDuration="2m7.09952487s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:09.098643851 +0000 UTC m=+146.578156378" watchObservedRunningTime="2026-02-17 00:08:09.09952487 +0000 UTC m=+146.579037397" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.195136 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.196628 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 00:08:09.696613351 +0000 UTC m=+147.176125878 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.299853 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.300354 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.800341907 +0000 UTC m=+147.279854434 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.351316 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-72t6m" podStartSLOduration=127.351295765 podStartE2EDuration="2m7.351295765s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:09.350309753 +0000 UTC m=+146.829822280" watchObservedRunningTime="2026-02-17 00:08:09.351295765 +0000 UTC m=+146.830808292" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.351756 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j7w5g" podStartSLOduration=127.35175279 podStartE2EDuration="2m7.35175279s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:09.307756136 +0000 UTC m=+146.787268663" watchObservedRunningTime="2026-02-17 00:08:09.35175279 +0000 UTC m=+146.831265317" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.399212 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.399531 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.401745 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.402070 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:09.902053957 +0000 UTC m=+147.381566484 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.451565 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4chtt" podStartSLOduration=8.451548918 podStartE2EDuration="8.451548918s" podCreationTimestamp="2026-02-17 00:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:09.450668521 +0000 UTC m=+146.930181048" watchObservedRunningTime="2026-02-17 00:08:09.451548918 +0000 UTC m=+146.931061445" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.486228 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-7pqm8" podStartSLOduration=126.486205773 podStartE2EDuration="2m6.486205773s" podCreationTimestamp="2026-02-17 00:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:09.485424387 +0000 UTC m=+146.964936914" watchObservedRunningTime="2026-02-17 00:08:09.486205773 +0000 UTC m=+146.965718310" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.492156 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bt9mf" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.504722 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.505078 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.005062169 +0000 UTC m=+147.484574696 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.605640 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.606192 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.106155879 +0000 UTC m=+147.585668406 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.684759 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:09 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:09 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:09 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.685091 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.713455 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.714742 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 00:08:10.21472014 +0000 UTC m=+147.694232667 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.815584 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.816020 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.316006037 +0000 UTC m=+147.795518564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.865153 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:09 crc kubenswrapper[4791]: I0217 00:08:09.916738 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:09 crc kubenswrapper[4791]: E0217 00:08:09.917090 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.417074346 +0000 UTC m=+147.896586873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.025558 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.025774 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.52574606 +0000 UTC m=+148.005258587 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.025860 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.026402 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.526391021 +0000 UTC m=+148.005903548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.035709 4791 patch_prober.go:28] interesting pod/console-operator-58897d9998-t5827 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.036000 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-t5827" podUID="4360bf41-9e45-498e-8f94-2c43a0dc88e5" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.043749 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" event={"ID":"811e9e22-1241-440a-9a6a-a6c51a0f0f7c","Type":"ContainerStarted","Data":"dba772c5c9ab5f5299b58fe79526e98940d7ac35727f37b1ab8d9d9a2e04639f"} Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.043791 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" event={"ID":"811e9e22-1241-440a-9a6a-a6c51a0f0f7c","Type":"ContainerStarted","Data":"9afeea82085ae371d14fbfa4430b59a5455b1127d46ebbf26203c0321439b678"} Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 
00:08:10.044414 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4chtt" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.048914 4791 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bfffb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.048988 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.053241 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sgzjl" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.126882 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.127067 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.627034927 +0000 UTC m=+148.106547454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.127862 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.129782 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.629764525 +0000 UTC m=+148.109277122 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.229261 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.229476 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.729447709 +0000 UTC m=+148.208960236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.229526 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.229884 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.729869743 +0000 UTC m=+148.209382270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.330974 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.331180 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.83115408 +0000 UTC m=+148.310666607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.331497 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.331829 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.831821551 +0000 UTC m=+148.311334069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.432699 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.432905 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.932876831 +0000 UTC m=+148.412389358 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.433127 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.433462 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:10.933449549 +0000 UTC m=+148.412962076 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.446059 4791 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.534651 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.534834 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 00:08:11.034802217 +0000 UTC m=+148.514314744 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.535096 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:10 crc kubenswrapper[4791]: E0217 00:08:10.535425 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 00:08:11.035408987 +0000 UTC m=+148.514921514 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wfpf2" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.538201 4791 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-17T00:08:10.446095936Z","Handler":null,"Name":""} Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.544540 4791 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.544568 4791 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.636443 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.640902 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.670576 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxk8k" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.682377 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:10 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:10 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:10 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.682437 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.738120 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.755343 4791 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.755380 4791 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.871820 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cgmd4"] Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.872740 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.876364 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.915406 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cgmd4"] Feb 17 00:08:10 crc kubenswrapper[4791]: I0217 00:08:10.918363 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wfpf2\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.027928 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8xbcp"] Feb 17 00:08:11 crc 
kubenswrapper[4791]: I0217 00:08:11.029100 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.030528 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.040674 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-utilities\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.040776 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-catalog-content\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.040820 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq9sd\" (UniqueName: \"kubernetes.io/projected/48855520-658c-4579-a867-7e984bce56c7-kube-api-access-rq9sd\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.068041 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" event={"ID":"811e9e22-1241-440a-9a6a-a6c51a0f0f7c","Type":"ContainerStarted","Data":"a74dd55763b606bc9278adcb53618735d3d9f706a1b407bef735ee336f188514"} Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.093584 
4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8xbcp"] Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.096997 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.142790 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-catalog-content\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.142883 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blntw\" (UniqueName: \"kubernetes.io/projected/f04d6e19-5c11-4527-8a49-3208098d2575-kube-api-access-blntw\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.142904 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq9sd\" (UniqueName: \"kubernetes.io/projected/48855520-658c-4579-a867-7e984bce56c7-kube-api-access-rq9sd\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.142937 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.142991 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-catalog-content\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.143092 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-utilities\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.143140 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.143198 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-utilities\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.143253 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.143277 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.144286 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-catalog-content\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.147955 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-utilities\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.150099 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.151505 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.152451 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.161826 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.168373 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.189442 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.189969 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq9sd\" (UniqueName: \"kubernetes.io/projected/48855520-658c-4579-a867-7e984bce56c7-kube-api-access-rq9sd\") pod \"certified-operators-cgmd4\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.190013 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.208612 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.231170 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-tlpgd" podStartSLOduration=10.231149427 podStartE2EDuration="10.231149427s" podCreationTimestamp="2026-02-17 00:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:11.122430391 +0000 UTC m=+148.601942908" watchObservedRunningTime="2026-02-17 00:08:11.231149427 +0000 UTC m=+148.710661954" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.236494 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.237020 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tt7zf"] Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.237955 4791 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.244007 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-catalog-content\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.244105 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-utilities\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.244214 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blntw\" (UniqueName: \"kubernetes.io/projected/f04d6e19-5c11-4527-8a49-3208098d2575-kube-api-access-blntw\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.245211 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tt7zf"] Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.245476 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-utilities\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.245611 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-catalog-content\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.269014 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blntw\" (UniqueName: \"kubernetes.io/projected/f04d6e19-5c11-4527-8a49-3208098d2575-kube-api-access-blntw\") pod \"community-operators-8xbcp\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.345232 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-catalog-content\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.345477 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-utilities\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.345511 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndp5p\" (UniqueName: \"kubernetes.io/projected/9895217e-9934-4f80-a583-98842d597690-kube-api-access-ndp5p\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.346435 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.439434 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b9dcp"] Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.440434 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.446653 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-catalog-content\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.446698 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-utilities\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.446726 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndp5p\" (UniqueName: \"kubernetes.io/projected/9895217e-9934-4f80-a583-98842d597690-kube-api-access-ndp5p\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.447413 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-catalog-content\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " 
pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.453019 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-utilities\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.458665 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9dcp"] Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.491000 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndp5p\" (UniqueName: \"kubernetes.io/projected/9895217e-9934-4f80-a583-98842d597690-kube-api-access-ndp5p\") pod \"certified-operators-tt7zf\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.547890 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6fd4\" (UniqueName: \"kubernetes.io/projected/9e98ac61-7140-4c15-8c29-47676734a52d-kube-api-access-f6fd4\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.547986 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-utilities\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.548042 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-catalog-content\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.553257 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.652549 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6fd4\" (UniqueName: \"kubernetes.io/projected/9e98ac61-7140-4c15-8c29-47676734a52d-kube-api-access-f6fd4\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.652614 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-utilities\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.652678 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-catalog-content\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.653262 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-utilities\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " 
pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.653504 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-catalog-content\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.673743 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6fd4\" (UniqueName: \"kubernetes.io/projected/9e98ac61-7140-4c15-8c29-47676734a52d-kube-api-access-f6fd4\") pod \"community-operators-b9dcp\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.676772 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:11 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:11 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:11 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.676848 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.777952 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8xbcp"] Feb 17 00:08:11 crc kubenswrapper[4791]: W0217 00:08:11.784763 4791 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf04d6e19_5c11_4527_8a49_3208098d2575.slice/crio-8181b3c79ba3d257e580e3f1df6f57468b32e1bd3945a81c8cb4156646ea066f WatchSource:0}: Error finding container 8181b3c79ba3d257e580e3f1df6f57468b32e1bd3945a81c8cb4156646ea066f: Status 404 returned error can't find the container with id 8181b3c79ba3d257e580e3f1df6f57468b32e1bd3945a81c8cb4156646ea066f Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.811327 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.819910 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wfpf2"] Feb 17 00:08:11 crc kubenswrapper[4791]: I0217 00:08:11.825489 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tt7zf"] Feb 17 00:08:11 crc kubenswrapper[4791]: W0217 00:08:11.834017 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc33165ce_519a_4b0e_b62a_f153d38fc14c.slice/crio-a7ca3d122288b33fa8a847c41ca82278c8736040a34df9221628fbf85c038b55 WatchSource:0}: Error finding container a7ca3d122288b33fa8a847c41ca82278c8736040a34df9221628fbf85c038b55: Status 404 returned error can't find the container with id a7ca3d122288b33fa8a847c41ca82278c8736040a34df9221628fbf85c038b55 Feb 17 00:08:11 crc kubenswrapper[4791]: W0217 00:08:11.835365 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9895217e_9934_4f80_a583_98842d597690.slice/crio-977ab4aa20ed7b60a4814c6d6489439a907a290d464af852a9ad4af4039d6953 WatchSource:0}: Error finding container 977ab4aa20ed7b60a4814c6d6489439a907a290d464af852a9ad4af4039d6953: Status 404 returned error can't find the container with id 
977ab4aa20ed7b60a4814c6d6489439a907a290d464af852a9ad4af4039d6953 Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.000698 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cgmd4"] Feb 17 00:08:12 crc kubenswrapper[4791]: W0217 00:08:12.002039 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-9d33d0a59b1363e6ccc78242f7e8cb6ff6a5b0aa639f99a163c5c4caa48c1335 WatchSource:0}: Error finding container 9d33d0a59b1363e6ccc78242f7e8cb6ff6a5b0aa639f99a163c5c4caa48c1335: Status 404 returned error can't find the container with id 9d33d0a59b1363e6ccc78242f7e8cb6ff6a5b0aa639f99a163c5c4caa48c1335 Feb 17 00:08:12 crc kubenswrapper[4791]: W0217 00:08:12.005639 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-1e35f43d7a85477c414761d38a0214bd527154fb9e5910306b2a5e5a9e67d613 WatchSource:0}: Error finding container 1e35f43d7a85477c414761d38a0214bd527154fb9e5910306b2a5e5a9e67d613: Status 404 returned error can't find the container with id 1e35f43d7a85477c414761d38a0214bd527154fb9e5910306b2a5e5a9e67d613 Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.081465 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"fe8fc49335ce9c112e635fb18ef33bc5b86c1cb1a438af4f84b3fc61e068b2b3"} Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.081538 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"153e79a3bc4b82199aa5f26d5fae710b75335a093fc90ef0af29f47e3107c3f0"} Feb 17 00:08:12 
crc kubenswrapper[4791]: I0217 00:08:12.081970 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.109400 4791 generic.go:334] "Generic (PLEG): container finished" podID="4dde99c9-4e15-4bdc-ba17-5d00e4117b1c" containerID="891df08438b436b0b07d8427461e11fc0c9eb7e638150952cf235c2137b716b6" exitCode=0 Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.109498 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" event={"ID":"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c","Type":"ContainerDied","Data":"891df08438b436b0b07d8427461e11fc0c9eb7e638150952cf235c2137b716b6"} Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.112122 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgmd4" event={"ID":"48855520-658c-4579-a867-7e984bce56c7","Type":"ContainerStarted","Data":"f3a8bf4a4e4255984cba5a86035a408c84d7e84e14a3acd43f2d8aaf7ecd5cee"} Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.115258 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tt7zf" event={"ID":"9895217e-9934-4f80-a583-98842d597690","Type":"ContainerStarted","Data":"977ab4aa20ed7b60a4814c6d6489439a907a290d464af852a9ad4af4039d6953"} Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.117172 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" event={"ID":"c33165ce-519a-4b0e-b62a-f153d38fc14c","Type":"ContainerStarted","Data":"a7ca3d122288b33fa8a847c41ca82278c8736040a34df9221628fbf85c038b55"} Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.118534 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9d33d0a59b1363e6ccc78242f7e8cb6ff6a5b0aa639f99a163c5c4caa48c1335"} Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.121141 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbcp" event={"ID":"f04d6e19-5c11-4527-8a49-3208098d2575","Type":"ContainerStarted","Data":"18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c"} Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.121161 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbcp" event={"ID":"f04d6e19-5c11-4527-8a49-3208098d2575","Type":"ContainerStarted","Data":"8181b3c79ba3d257e580e3f1df6f57468b32e1bd3945a81c8cb4156646ea066f"} Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.121865 4791 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.123699 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1e35f43d7a85477c414761d38a0214bd527154fb9e5910306b2a5e5a9e67d613"} Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.302697 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9dcp"] Feb 17 00:08:12 crc kubenswrapper[4791]: W0217 00:08:12.354623 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e98ac61_7140_4c15_8c29_47676734a52d.slice/crio-b50d297ceb2781d37cf93aa6962f7a2c6eaee4f855e23a5380418bef8dad3d16 WatchSource:0}: Error finding container b50d297ceb2781d37cf93aa6962f7a2c6eaee4f855e23a5380418bef8dad3d16: Status 404 returned error can't find the container with id 
b50d297ceb2781d37cf93aa6962f7a2c6eaee4f855e23a5380418bef8dad3d16 Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.432520 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.433439 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.436065 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.436611 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.444265 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.566183 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b169853-1972-4cb9-9a80-159b0b3456fa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.566289 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b169853-1972-4cb9-9a80-159b0b3456fa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.667891 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b169853-1972-4cb9-9a80-159b0b3456fa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.667982 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b169853-1972-4cb9-9a80-159b0b3456fa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.668100 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b169853-1972-4cb9-9a80-159b0b3456fa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.676633 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:12 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:12 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:12 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.676972 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.691709 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b169853-1972-4cb9-9a80-159b0b3456fa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 00:08:12 crc kubenswrapper[4791]: I0217 00:08:12.745926 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.031180 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h66xr"] Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.032424 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.041399 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.056802 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h66xr"] Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.139402 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.149782 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"342c0d8e9fe915056a180d479cef043362d410ea2642f6da8cd8cee34bd4460c"} Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.171749 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" 
event={"ID":"c33165ce-519a-4b0e-b62a-f153d38fc14c","Type":"ContainerStarted","Data":"13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3"} Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.172338 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.176080 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2txwc\" (UniqueName: \"kubernetes.io/projected/db1caaaf-7e8b-405c-97ff-7c507f068688-kube-api-access-2txwc\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.176152 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-catalog-content\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.176217 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-utilities\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.184380 4791 generic.go:334] "Generic (PLEG): container finished" podID="f04d6e19-5c11-4527-8a49-3208098d2575" containerID="18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c" exitCode=0 Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.184505 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-8xbcp" event={"ID":"f04d6e19-5c11-4527-8a49-3208098d2575","Type":"ContainerDied","Data":"18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c"} Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.188220 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"62f4c03ce4a00b767928e580f0d8ab3b5b984df57bef049c06764177ce55597c"} Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.190030 4791 generic.go:334] "Generic (PLEG): container finished" podID="9e98ac61-7140-4c15-8c29-47676734a52d" containerID="d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7" exitCode=0 Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.190131 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9dcp" event={"ID":"9e98ac61-7140-4c15-8c29-47676734a52d","Type":"ContainerDied","Data":"d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7"} Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.190168 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9dcp" event={"ID":"9e98ac61-7140-4c15-8c29-47676734a52d","Type":"ContainerStarted","Data":"b50d297ceb2781d37cf93aa6962f7a2c6eaee4f855e23a5380418bef8dad3d16"} Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.194786 4791 generic.go:334] "Generic (PLEG): container finished" podID="48855520-658c-4579-a867-7e984bce56c7" containerID="bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82" exitCode=0 Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.194845 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgmd4" 
event={"ID":"48855520-658c-4579-a867-7e984bce56c7","Type":"ContainerDied","Data":"bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82"} Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.211053 4791 generic.go:334] "Generic (PLEG): container finished" podID="9895217e-9934-4f80-a583-98842d597690" containerID="b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33" exitCode=0 Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.212424 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tt7zf" event={"ID":"9895217e-9934-4f80-a583-98842d597690","Type":"ContainerDied","Data":"b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33"} Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.217231 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" podStartSLOduration=131.217210873 podStartE2EDuration="2m11.217210873s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:13.210226389 +0000 UTC m=+150.689738916" watchObservedRunningTime="2026-02-17 00:08:13.217210873 +0000 UTC m=+150.696723400" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.279436 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2txwc\" (UniqueName: \"kubernetes.io/projected/db1caaaf-7e8b-405c-97ff-7c507f068688-kube-api-access-2txwc\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.279467 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-catalog-content\") pod 
\"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.279562 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-utilities\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.281062 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-catalog-content\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.281094 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-utilities\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.305033 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2txwc\" (UniqueName: \"kubernetes.io/projected/db1caaaf-7e8b-405c-97ff-7c507f068688-kube-api-access-2txwc\") pod \"redhat-marketplace-h66xr\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.355664 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.458474 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-twq6q"] Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.461498 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.475042 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-twq6q"] Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.607970 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hwtv\" (UniqueName: \"kubernetes.io/projected/509fdefa-10b0-4752-b844-843ed9e7106d-kube-api-access-8hwtv\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.608041 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-utilities\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.608079 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-catalog-content\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.611541 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.660981 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h66xr"] Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.676704 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:13 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:13 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:13 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.676989 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.709478 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-config-volume\") pod \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.709586 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-secret-volume\") pod \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.709634 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvk22\" (UniqueName: 
\"kubernetes.io/projected/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-kube-api-access-tvk22\") pod \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\" (UID: \"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c\") " Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.709767 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-catalog-content\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.709833 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hwtv\" (UniqueName: \"kubernetes.io/projected/509fdefa-10b0-4752-b844-843ed9e7106d-kube-api-access-8hwtv\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.709876 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-utilities\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.710290 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-utilities\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.710778 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-catalog-content\") pod 
\"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.710802 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-config-volume" (OuterVolumeSpecName: "config-volume") pod "4dde99c9-4e15-4bdc-ba17-5d00e4117b1c" (UID: "4dde99c9-4e15-4bdc-ba17-5d00e4117b1c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.716213 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-kube-api-access-tvk22" (OuterVolumeSpecName: "kube-api-access-tvk22") pod "4dde99c9-4e15-4bdc-ba17-5d00e4117b1c" (UID: "4dde99c9-4e15-4bdc-ba17-5d00e4117b1c"). InnerVolumeSpecName "kube-api-access-tvk22". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.720221 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4dde99c9-4e15-4bdc-ba17-5d00e4117b1c" (UID: "4dde99c9-4e15-4bdc-ba17-5d00e4117b1c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.729429 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hwtv\" (UniqueName: \"kubernetes.io/projected/509fdefa-10b0-4752-b844-843ed9e7106d-kube-api-access-8hwtv\") pod \"redhat-marketplace-twq6q\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.811448 4791 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.811490 4791 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.811508 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvk22\" (UniqueName: \"kubernetes.io/projected/4dde99c9-4e15-4bdc-ba17-5d00e4117b1c-kube-api-access-tvk22\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:13 crc kubenswrapper[4791]: I0217 00:08:13.847369 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.024288 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s76xp"] Feb 17 00:08:14 crc kubenswrapper[4791]: E0217 00:08:14.024822 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dde99c9-4e15-4bdc-ba17-5d00e4117b1c" containerName="collect-profiles" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.024839 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dde99c9-4e15-4bdc-ba17-5d00e4117b1c" containerName="collect-profiles" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.024953 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dde99c9-4e15-4bdc-ba17-5d00e4117b1c" containerName="collect-profiles" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.025825 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.031020 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s76xp"] Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.032926 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.066523 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.075076 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-flvjk" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.160717 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:14 crc 
kubenswrapper[4791]: I0217 00:08:14.160757 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.162655 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h6gj\" (UniqueName: \"kubernetes.io/projected/6e6c03f6-847b-402c-bfde-6dd30870b907-kube-api-access-6h6gj\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.162731 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-utilities\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.162781 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-catalog-content\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.186359 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-t5827" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.190663 4791 patch_prober.go:28] interesting pod/console-f9d7485db-frmbv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.190705 4791 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-frmbv" podUID="155619c1-12ba-4149-9dce-474e3735168c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.245000 4791 generic.go:334] "Generic (PLEG): container finished" podID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerID="498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd" exitCode=0 Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.245062 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h66xr" event={"ID":"db1caaaf-7e8b-405c-97ff-7c507f068688","Type":"ContainerDied","Data":"498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd"} Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.245088 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h66xr" event={"ID":"db1caaaf-7e8b-405c-97ff-7c507f068688","Type":"ContainerStarted","Data":"0a2777b10322faf11f31316ab253e0d88d658e157f8af01279ebdea911a277fa"} Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.255510 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8b169853-1972-4cb9-9a80-159b0b3456fa","Type":"ContainerStarted","Data":"c3c101c94bb03448c1b4eb251a0fab2366de8d1d9d41e50fd26d9559d10d5bd0"} Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.255549 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8b169853-1972-4cb9-9a80-159b0b3456fa","Type":"ContainerStarted","Data":"6dac1b8347eda9b35ac38349a12edc6d86d8bfa6b545b17f2fe10256ba56df85"} Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.265721 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-6h6gj\" (UniqueName: \"kubernetes.io/projected/6e6c03f6-847b-402c-bfde-6dd30870b907-kube-api-access-6h6gj\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.265867 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-utilities\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.265944 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-catalog-content\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.266511 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-catalog-content\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.267303 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-utilities\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.276382 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" 
event={"ID":"4dde99c9-4e15-4bdc-ba17-5d00e4117b1c","Type":"ContainerDied","Data":"a39ba9373a1c8330d92d717ecb292520ee487332b22cd14b0e5fe57ebfb54ebe"} Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.276422 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a39ba9373a1c8330d92d717ecb292520ee487332b22cd14b0e5fe57ebfb54ebe" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.277071 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521440-crl2x" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.292315 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h6gj\" (UniqueName: \"kubernetes.io/projected/6e6c03f6-847b-402c-bfde-6dd30870b907-kube-api-access-6h6gj\") pod \"redhat-operators-s76xp\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.292455 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.293089 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.298843 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.299134 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.302201 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-twq6q"] Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.309236 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.312963 4791 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt865 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.313011 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-rt865" podUID="0522c983-dae6-41ca-807a-ff45912a0024" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.313417 4791 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt865 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.313588 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rt865" 
podUID="0522c983-dae6-41ca-807a-ff45912a0024" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.350801 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.427858 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bj8pc"] Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.435153 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.435339 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bj8pc"] Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.479083 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28abe80d-37ae-45f7-abab-5bc2150e038e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.479233 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28abe80d-37ae-45f7-abab-5bc2150e038e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.479413 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-catalog-content\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.479544 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngk4h\" (UniqueName: \"kubernetes.io/projected/fc437e64-8eee-418b-83d2-f79578cec0fe-kube-api-access-ngk4h\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.479713 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-utilities\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.580672 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngk4h\" (UniqueName: \"kubernetes.io/projected/fc437e64-8eee-418b-83d2-f79578cec0fe-kube-api-access-ngk4h\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.580741 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-utilities\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.580789 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/28abe80d-37ae-45f7-abab-5bc2150e038e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.580816 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28abe80d-37ae-45f7-abab-5bc2150e038e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.580853 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-catalog-content\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.581061 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28abe80d-37ae-45f7-abab-5bc2150e038e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.581527 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-utilities\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.581518 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-catalog-content\") pod 
\"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.605005 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28abe80d-37ae-45f7-abab-5bc2150e038e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.632772 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngk4h\" (UniqueName: \"kubernetes.io/projected/fc437e64-8eee-418b-83d2-f79578cec0fe-kube-api-access-ngk4h\") pod \"redhat-operators-bj8pc\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: E0217 00:08:14.635723 4791 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod509fdefa_10b0_4752_b844_843ed9e7106d.slice/crio-conmon-df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod509fdefa_10b0_4752_b844_843ed9e7106d.slice/crio-df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98.scope\": RecentStats: unable to find data in memory cache]" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.666534 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.673764 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.679325 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:14 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:14 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:14 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.679384 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.686362 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s76xp"] Feb 17 00:08:14 crc kubenswrapper[4791]: W0217 00:08:14.721242 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e6c03f6_847b_402c_bfde_6dd30870b907.slice/crio-d67dd9afc803b2b8cf537c2d990677521dc5d35daf8d8201c4e1dd9fc3670d22 WatchSource:0}: Error finding container d67dd9afc803b2b8cf537c2d990677521dc5d35daf8d8201c4e1dd9fc3670d22: Status 404 returned error can't find the container with id d67dd9afc803b2b8cf537c2d990677521dc5d35daf8d8201c4e1dd9fc3670d22 Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.760506 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:14 crc kubenswrapper[4791]: I0217 00:08:14.940495 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.019961 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.027407 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bj8pc"] Feb 17 00:08:15 crc kubenswrapper[4791]: W0217 00:08:15.065280 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc437e64_8eee_418b_83d2_f79578cec0fe.slice/crio-66dcd55652a7b7d244cc261ad14bf94f214d23f47f292c66da7aede063b2653b WatchSource:0}: Error finding container 66dcd55652a7b7d244cc261ad14bf94f214d23f47f292c66da7aede063b2653b: Status 404 returned error can't find the container with id 66dcd55652a7b7d244cc261ad14bf94f214d23f47f292c66da7aede063b2653b Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.285806 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerStarted","Data":"4c3e1e8da3848cbfda653b16270111f1411ce1e227ddf3a4156d7065874d5fdb"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.285860 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerStarted","Data":"66dcd55652a7b7d244cc261ad14bf94f214d23f47f292c66da7aede063b2653b"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.290017 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"28abe80d-37ae-45f7-abab-5bc2150e038e","Type":"ContainerStarted","Data":"dd5583159b97d008658f84cf600c20f343a255c6d03a482c074c8e01afed6de8"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.294172 4791 generic.go:334] "Generic (PLEG): container finished" podID="8b169853-1972-4cb9-9a80-159b0b3456fa" containerID="c3c101c94bb03448c1b4eb251a0fab2366de8d1d9d41e50fd26d9559d10d5bd0" exitCode=0 Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.294431 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8b169853-1972-4cb9-9a80-159b0b3456fa","Type":"ContainerDied","Data":"c3c101c94bb03448c1b4eb251a0fab2366de8d1d9d41e50fd26d9559d10d5bd0"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.303157 4791 generic.go:334] "Generic (PLEG): container finished" podID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerID="ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8" exitCode=0 Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.303236 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s76xp" event={"ID":"6e6c03f6-847b-402c-bfde-6dd30870b907","Type":"ContainerDied","Data":"ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.303284 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s76xp" event={"ID":"6e6c03f6-847b-402c-bfde-6dd30870b907","Type":"ContainerStarted","Data":"d67dd9afc803b2b8cf537c2d990677521dc5d35daf8d8201c4e1dd9fc3670d22"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.311647 4791 generic.go:334] "Generic (PLEG): container finished" podID="509fdefa-10b0-4752-b844-843ed9e7106d" containerID="df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98" exitCode=0 Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.311687 4791 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-twq6q" event={"ID":"509fdefa-10b0-4752-b844-843ed9e7106d","Type":"ContainerDied","Data":"df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.311714 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twq6q" event={"ID":"509fdefa-10b0-4752-b844-843ed9e7106d","Type":"ContainerStarted","Data":"10bc712141e232e5d7064d897763d2f2a15a17f24973eb264bc82a8e5f9eb39a"} Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.673947 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.684601 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:15 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:15 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:15 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.684676 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.710879 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b169853-1972-4cb9-9a80-159b0b3456fa-kube-api-access\") pod \"8b169853-1972-4cb9-9a80-159b0b3456fa\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.711035 4791 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b169853-1972-4cb9-9a80-159b0b3456fa-kubelet-dir\") pod \"8b169853-1972-4cb9-9a80-159b0b3456fa\" (UID: \"8b169853-1972-4cb9-9a80-159b0b3456fa\") " Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.711352 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b169853-1972-4cb9-9a80-159b0b3456fa-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8b169853-1972-4cb9-9a80-159b0b3456fa" (UID: "8b169853-1972-4cb9-9a80-159b0b3456fa"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.716046 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b169853-1972-4cb9-9a80-159b0b3456fa-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8b169853-1972-4cb9-9a80-159b0b3456fa" (UID: "8b169853-1972-4cb9-9a80-159b0b3456fa"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.811741 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b169853-1972-4cb9-9a80-159b0b3456fa-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:15 crc kubenswrapper[4791]: I0217 00:08:15.811782 4791 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b169853-1972-4cb9-9a80-159b0b3456fa-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.330103 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8b169853-1972-4cb9-9a80-159b0b3456fa","Type":"ContainerDied","Data":"6dac1b8347eda9b35ac38349a12edc6d86d8bfa6b545b17f2fe10256ba56df85"} Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.330150 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dac1b8347eda9b35ac38349a12edc6d86d8bfa6b545b17f2fe10256ba56df85" Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.330257 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.361007 4791 generic.go:334] "Generic (PLEG): container finished" podID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerID="4c3e1e8da3848cbfda653b16270111f1411ce1e227ddf3a4156d7065874d5fdb" exitCode=0 Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.361072 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerDied","Data":"4c3e1e8da3848cbfda653b16270111f1411ce1e227ddf3a4156d7065874d5fdb"} Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.364571 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"28abe80d-37ae-45f7-abab-5bc2150e038e","Type":"ContainerStarted","Data":"47bb1560d306fcdd076335cbd39df3e7b038c57fdd7600efd790c8af4a167710"} Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.676080 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:16 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:16 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:16 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:16 crc kubenswrapper[4791]: I0217 00:08:16.676201 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.059237 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4chtt" Feb 17 
00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.395746 4791 generic.go:334] "Generic (PLEG): container finished" podID="28abe80d-37ae-45f7-abab-5bc2150e038e" containerID="47bb1560d306fcdd076335cbd39df3e7b038c57fdd7600efd790c8af4a167710" exitCode=0 Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.395791 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"28abe80d-37ae-45f7-abab-5bc2150e038e","Type":"ContainerDied","Data":"47bb1560d306fcdd076335cbd39df3e7b038c57fdd7600efd790c8af4a167710"} Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.678364 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:17 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:17 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:17 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.678422 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.714492 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.754273 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28abe80d-37ae-45f7-abab-5bc2150e038e-kubelet-dir\") pod \"28abe80d-37ae-45f7-abab-5bc2150e038e\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.754372 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28abe80d-37ae-45f7-abab-5bc2150e038e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "28abe80d-37ae-45f7-abab-5bc2150e038e" (UID: "28abe80d-37ae-45f7-abab-5bc2150e038e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.754559 4791 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28abe80d-37ae-45f7-abab-5bc2150e038e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.855031 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28abe80d-37ae-45f7-abab-5bc2150e038e-kube-api-access\") pod \"28abe80d-37ae-45f7-abab-5bc2150e038e\" (UID: \"28abe80d-37ae-45f7-abab-5bc2150e038e\") " Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.878585 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28abe80d-37ae-45f7-abab-5bc2150e038e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "28abe80d-37ae-45f7-abab-5bc2150e038e" (UID: "28abe80d-37ae-45f7-abab-5bc2150e038e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:08:17 crc kubenswrapper[4791]: I0217 00:08:17.957093 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28abe80d-37ae-45f7-abab-5bc2150e038e-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:18 crc kubenswrapper[4791]: I0217 00:08:18.427452 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"28abe80d-37ae-45f7-abab-5bc2150e038e","Type":"ContainerDied","Data":"dd5583159b97d008658f84cf600c20f343a255c6d03a482c074c8e01afed6de8"} Feb 17 00:08:18 crc kubenswrapper[4791]: I0217 00:08:18.427495 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd5583159b97d008658f84cf600c20f343a255c6d03a482c074c8e01afed6de8" Feb 17 00:08:18 crc kubenswrapper[4791]: I0217 00:08:18.427511 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 00:08:18 crc kubenswrapper[4791]: I0217 00:08:18.689583 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:18 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:18 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:18 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:18 crc kubenswrapper[4791]: I0217 00:08:18.689652 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:19 crc kubenswrapper[4791]: I0217 00:08:19.675550 4791 patch_prober.go:28] interesting 
pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:19 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:19 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:19 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:19 crc kubenswrapper[4791]: I0217 00:08:19.675614 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:20 crc kubenswrapper[4791]: I0217 00:08:20.677212 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:20 crc kubenswrapper[4791]: [-]has-synced failed: reason withheld Feb 17 00:08:20 crc kubenswrapper[4791]: [+]process-running ok Feb 17 00:08:20 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:20 crc kubenswrapper[4791]: I0217 00:08:20.677606 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:21 crc kubenswrapper[4791]: I0217 00:08:21.676730 4791 patch_prober.go:28] interesting pod/router-default-5444994796-kt8q6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 00:08:21 crc kubenswrapper[4791]: [+]has-synced ok Feb 17 00:08:21 crc kubenswrapper[4791]: [+]process-running ok Feb 
17 00:08:21 crc kubenswrapper[4791]: healthz check failed Feb 17 00:08:21 crc kubenswrapper[4791]: I0217 00:08:21.676795 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kt8q6" podUID="459f3992-b770-44d7-9ecc-0ae8a228134f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 00:08:22 crc kubenswrapper[4791]: I0217 00:08:22.676782 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:22 crc kubenswrapper[4791]: I0217 00:08:22.679343 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-kt8q6" Feb 17 00:08:24 crc kubenswrapper[4791]: I0217 00:08:24.135992 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:24 crc kubenswrapper[4791]: I0217 00:08:24.140140 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-frmbv" Feb 17 00:08:24 crc kubenswrapper[4791]: I0217 00:08:24.315697 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-rt865" Feb 17 00:08:24 crc kubenswrapper[4791]: I0217 00:08:24.972938 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:08:24 crc kubenswrapper[4791]: I0217 00:08:24.972999 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:08:25 crc kubenswrapper[4791]: I0217 00:08:25.692063 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:08:25 crc kubenswrapper[4791]: I0217 00:08:25.699862 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d97cf45-2324-494c-839f-6f264eba3828-metrics-certs\") pod \"network-metrics-daemon-6x28n\" (UID: \"1d97cf45-2324-494c-839f-6f264eba3828\") " pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:08:25 crc kubenswrapper[4791]: I0217 00:08:25.848825 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6x28n" Feb 17 00:08:27 crc kubenswrapper[4791]: I0217 00:08:27.491873 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6x28n"] Feb 17 00:08:31 crc kubenswrapper[4791]: I0217 00:08:31.104133 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:08:32 crc kubenswrapper[4791]: W0217 00:08:32.104179 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d97cf45_2324_494c_839f_6f264eba3828.slice/crio-2593a58bff4cf0c8b37c180a661d10df1583c5c2555073df032a3a53ba8b392e WatchSource:0}: Error finding container 2593a58bff4cf0c8b37c180a661d10df1583c5c2555073df032a3a53ba8b392e: Status 404 returned error can't find the container with id 2593a58bff4cf0c8b37c180a661d10df1583c5c2555073df032a3a53ba8b392e Feb 17 00:08:32 crc 
kubenswrapper[4791]: I0217 00:08:32.523563 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6x28n" event={"ID":"1d97cf45-2324-494c-839f-6f264eba3828","Type":"ContainerStarted","Data":"2593a58bff4cf0c8b37c180a661d10df1583c5c2555073df032a3a53ba8b392e"} Feb 17 00:08:38 crc kubenswrapper[4791]: I0217 00:08:38.553518 4791 generic.go:334] "Generic (PLEG): container finished" podID="94401a93-55c7-4e8b-83f7-dc27a876f335" containerID="1bf210069f01dcf3433075dd8a895405951d971de359016b4eb9aa868416c26a" exitCode=0 Feb 17 00:08:38 crc kubenswrapper[4791]: I0217 00:08:38.553618 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29521440-k6f7k" event={"ID":"94401a93-55c7-4e8b-83f7-dc27a876f335","Type":"ContainerDied","Data":"1bf210069f01dcf3433075dd8a895405951d971de359016b4eb9aa868416c26a"} Feb 17 00:08:41 crc kubenswrapper[4791]: I0217 00:08:41.247284 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 00:08:42 crc kubenswrapper[4791]: E0217 00:08:42.747602 4791 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 17 00:08:42 crc kubenswrapper[4791]: E0217 00:08:42.749691 4791 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ndp5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-tt7zf_openshift-marketplace(9895217e-9934-4f80-a583-98842d597690): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 00:08:42 crc kubenswrapper[4791]: E0217 00:08:42.751063 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-tt7zf" podUID="9895217e-9934-4f80-a583-98842d597690" Feb 17 00:08:43 crc 
kubenswrapper[4791]: E0217 00:08:43.770233 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-tt7zf" podUID="9895217e-9934-4f80-a583-98842d597690" Feb 17 00:08:43 crc kubenswrapper[4791]: I0217 00:08:43.812484 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.876206 4791 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.876332 4791 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2txwc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-h66xr_openshift-marketplace(db1caaaf-7e8b-405c-97ff-7c507f068688): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.877435 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-h66xr" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" Feb 17 00:08:43 crc 
kubenswrapper[4791]: E0217 00:08:43.919036 4791 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.919200 4791 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rq9sd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-cgmd4_openshift-marketplace(48855520-658c-4579-a867-7e984bce56c7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.921001 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-cgmd4" podUID="48855520-658c-4579-a867-7e984bce56c7" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.937567 4791 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.937692 4791 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8hwtv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-twq6q_openshift-marketplace(509fdefa-10b0-4752-b844-843ed9e7106d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 00:08:43 crc kubenswrapper[4791]: E0217 00:08:43.939007 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-twq6q" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" Feb 17 00:08:43 crc 
kubenswrapper[4791]: I0217 00:08:43.948034 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/94401a93-55c7-4e8b-83f7-dc27a876f335-serviceca\") pod \"94401a93-55c7-4e8b-83f7-dc27a876f335\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " Feb 17 00:08:43 crc kubenswrapper[4791]: I0217 00:08:43.948173 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgt9z\" (UniqueName: \"kubernetes.io/projected/94401a93-55c7-4e8b-83f7-dc27a876f335-kube-api-access-cgt9z\") pod \"94401a93-55c7-4e8b-83f7-dc27a876f335\" (UID: \"94401a93-55c7-4e8b-83f7-dc27a876f335\") " Feb 17 00:08:43 crc kubenswrapper[4791]: I0217 00:08:43.949426 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94401a93-55c7-4e8b-83f7-dc27a876f335-serviceca" (OuterVolumeSpecName: "serviceca") pod "94401a93-55c7-4e8b-83f7-dc27a876f335" (UID: "94401a93-55c7-4e8b-83f7-dc27a876f335"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:08:43 crc kubenswrapper[4791]: I0217 00:08:43.954574 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94401a93-55c7-4e8b-83f7-dc27a876f335-kube-api-access-cgt9z" (OuterVolumeSpecName: "kube-api-access-cgt9z") pod "94401a93-55c7-4e8b-83f7-dc27a876f335" (UID: "94401a93-55c7-4e8b-83f7-dc27a876f335"). InnerVolumeSpecName "kube-api-access-cgt9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.049447 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgt9z\" (UniqueName: \"kubernetes.io/projected/94401a93-55c7-4e8b-83f7-dc27a876f335-kube-api-access-cgt9z\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.049479 4791 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/94401a93-55c7-4e8b-83f7-dc27a876f335-serviceca\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.587086 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s76xp" event={"ID":"6e6c03f6-847b-402c-bfde-6dd30870b907","Type":"ContainerStarted","Data":"f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5"} Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.589809 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29521440-k6f7k" Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.589809 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29521440-k6f7k" event={"ID":"94401a93-55c7-4e8b-83f7-dc27a876f335","Type":"ContainerDied","Data":"142ab3f00f5019c1be7909e2cdfb055edb77c1758b592a88fa5914ec63b0bda2"} Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.590136 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="142ab3f00f5019c1be7909e2cdfb055edb77c1758b592a88fa5914ec63b0bda2" Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.592000 4791 generic.go:334] "Generic (PLEG): container finished" podID="9e98ac61-7140-4c15-8c29-47676734a52d" containerID="b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533" exitCode=0 Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.592093 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9dcp" event={"ID":"9e98ac61-7140-4c15-8c29-47676734a52d","Type":"ContainerDied","Data":"b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533"} Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.594635 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerStarted","Data":"08a68b5f0e7fdf02d507ba4fd7e4b03827c1f9b663dd4ab621eef2f40ab4ac2d"} Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.597044 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6x28n" event={"ID":"1d97cf45-2324-494c-839f-6f264eba3828","Type":"ContainerStarted","Data":"e4a107bdf67b5b0b3b7b726d00831fb2f524244233c4f8ac2f8ddc8c2767337e"} Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.597088 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-6x28n" event={"ID":"1d97cf45-2324-494c-839f-6f264eba3828","Type":"ContainerStarted","Data":"0607d177a5164d672755abaafdf40881feea71c0f29f6a2d4ff504d3445deaf4"} Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.611741 4791 generic.go:334] "Generic (PLEG): container finished" podID="f04d6e19-5c11-4527-8a49-3208098d2575" containerID="2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2" exitCode=0 Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.611959 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbcp" event={"ID":"f04d6e19-5c11-4527-8a49-3208098d2575","Type":"ContainerDied","Data":"2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2"} Feb 17 00:08:44 crc kubenswrapper[4791]: E0217 00:08:44.613719 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-h66xr" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" Feb 17 00:08:44 crc kubenswrapper[4791]: E0217 00:08:44.614886 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-twq6q" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" Feb 17 00:08:44 crc kubenswrapper[4791]: E0217 00:08:44.614994 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-cgmd4" podUID="48855520-658c-4579-a867-7e984bce56c7" Feb 17 00:08:44 crc kubenswrapper[4791]: 
I0217 00:08:44.668223 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r2rv7" Feb 17 00:08:44 crc kubenswrapper[4791]: I0217 00:08:44.704447 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6x28n" podStartSLOduration=162.704406689 podStartE2EDuration="2m42.704406689s" podCreationTimestamp="2026-02-17 00:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:44.697339602 +0000 UTC m=+182.176852139" watchObservedRunningTime="2026-02-17 00:08:44.704406689 +0000 UTC m=+182.183919226" Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.617692 4791 generic.go:334] "Generic (PLEG): container finished" podID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerID="08a68b5f0e7fdf02d507ba4fd7e4b03827c1f9b663dd4ab621eef2f40ab4ac2d" exitCode=0 Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.617747 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerDied","Data":"08a68b5f0e7fdf02d507ba4fd7e4b03827c1f9b663dd4ab621eef2f40ab4ac2d"} Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.623820 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbcp" event={"ID":"f04d6e19-5c11-4527-8a49-3208098d2575","Type":"ContainerStarted","Data":"0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5"} Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.628925 4791 generic.go:334] "Generic (PLEG): container finished" podID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerID="f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5" exitCode=0 Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.628973 4791 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s76xp" event={"ID":"6e6c03f6-847b-402c-bfde-6dd30870b907","Type":"ContainerDied","Data":"f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5"} Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.634799 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9dcp" event={"ID":"9e98ac61-7140-4c15-8c29-47676734a52d","Type":"ContainerStarted","Data":"5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e"} Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.714081 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b9dcp" podStartSLOduration=2.894046635 podStartE2EDuration="34.714059251s" podCreationTimestamp="2026-02-17 00:08:11 +0000 UTC" firstStartedPulling="2026-02-17 00:08:13.191030832 +0000 UTC m=+150.670543359" lastFinishedPulling="2026-02-17 00:08:45.011043448 +0000 UTC m=+182.490555975" observedRunningTime="2026-02-17 00:08:45.691726923 +0000 UTC m=+183.171239450" watchObservedRunningTime="2026-02-17 00:08:45.714059251 +0000 UTC m=+183.193571798" Feb 17 00:08:45 crc kubenswrapper[4791]: I0217 00:08:45.714935 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8xbcp" podStartSLOduration=1.6209202120000001 podStartE2EDuration="34.714926579s" podCreationTimestamp="2026-02-17 00:08:11 +0000 UTC" firstStartedPulling="2026-02-17 00:08:12.121635119 +0000 UTC m=+149.601147646" lastFinishedPulling="2026-02-17 00:08:45.215641476 +0000 UTC m=+182.695154013" observedRunningTime="2026-02-17 00:08:45.713229455 +0000 UTC m=+183.192742002" watchObservedRunningTime="2026-02-17 00:08:45.714926579 +0000 UTC m=+183.194439126" Feb 17 00:08:46 crc kubenswrapper[4791]: I0217 00:08:46.648236 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" 
event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerStarted","Data":"d649ea334bf454488556fe2c70b06ff12d86f88ff25ffa7b4f00f581346ea820"} Feb 17 00:08:46 crc kubenswrapper[4791]: I0217 00:08:46.652261 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s76xp" event={"ID":"6e6c03f6-847b-402c-bfde-6dd30870b907","Type":"ContainerStarted","Data":"33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1"} Feb 17 00:08:46 crc kubenswrapper[4791]: I0217 00:08:46.668322 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bj8pc" podStartSLOduration=2.974785819 podStartE2EDuration="32.668301103s" podCreationTimestamp="2026-02-17 00:08:14 +0000 UTC" firstStartedPulling="2026-02-17 00:08:16.362303486 +0000 UTC m=+153.841816013" lastFinishedPulling="2026-02-17 00:08:46.05581877 +0000 UTC m=+183.535331297" observedRunningTime="2026-02-17 00:08:46.664347216 +0000 UTC m=+184.143859743" watchObservedRunningTime="2026-02-17 00:08:46.668301103 +0000 UTC m=+184.147813630" Feb 17 00:08:46 crc kubenswrapper[4791]: I0217 00:08:46.690904 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s76xp" podStartSLOduration=1.965649642 podStartE2EDuration="32.690888149s" podCreationTimestamp="2026-02-17 00:08:14 +0000 UTC" firstStartedPulling="2026-02-17 00:08:15.30715205 +0000 UTC m=+152.786664577" lastFinishedPulling="2026-02-17 00:08:46.032390567 +0000 UTC m=+183.511903084" observedRunningTime="2026-02-17 00:08:46.688620626 +0000 UTC m=+184.168133153" watchObservedRunningTime="2026-02-17 00:08:46.690888149 +0000 UTC m=+184.170400676" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.348323 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.348668 4791 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.482170 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 00:08:51 crc kubenswrapper[4791]: E0217 00:08:51.482436 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28abe80d-37ae-45f7-abab-5bc2150e038e" containerName="pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.482451 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="28abe80d-37ae-45f7-abab-5bc2150e038e" containerName="pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: E0217 00:08:51.482468 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94401a93-55c7-4e8b-83f7-dc27a876f335" containerName="image-pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.482477 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="94401a93-55c7-4e8b-83f7-dc27a876f335" containerName="image-pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: E0217 00:08:51.482498 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b169853-1972-4cb9-9a80-159b0b3456fa" containerName="pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.482508 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b169853-1972-4cb9-9a80-159b0b3456fa" containerName="pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.482637 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="94401a93-55c7-4e8b-83f7-dc27a876f335" containerName="image-pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.482652 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="28abe80d-37ae-45f7-abab-5bc2150e038e" containerName="pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.482664 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b169853-1972-4cb9-9a80-159b0b3456fa" 
containerName="pruner" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.483085 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.485057 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.485877 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.494554 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.558693 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.653193 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.653773 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.722998 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 
00:08:51.754755 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.754808 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.754874 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.780671 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.797265 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.812329 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.812363 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:51 crc kubenswrapper[4791]: I0217 00:08:51.875361 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:52 crc kubenswrapper[4791]: I0217 00:08:52.062451 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 00:08:52 crc kubenswrapper[4791]: W0217 00:08:52.076902 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode0995bd5_bc3f_4011_b6ca_ee70c4d84798.slice/crio-97c89836b71ff7d21a5271a0fc522b64b958612bc0508d95b9f8ef20888e934d WatchSource:0}: Error finding container 97c89836b71ff7d21a5271a0fc522b64b958612bc0508d95b9f8ef20888e934d: Status 404 returned error can't find the container with id 97c89836b71ff7d21a5271a0fc522b64b958612bc0508d95b9f8ef20888e934d Feb 17 00:08:52 crc kubenswrapper[4791]: I0217 00:08:52.691356 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e0995bd5-bc3f-4011-b6ca-ee70c4d84798","Type":"ContainerStarted","Data":"97c89836b71ff7d21a5271a0fc522b64b958612bc0508d95b9f8ef20888e934d"} Feb 17 00:08:52 crc kubenswrapper[4791]: I0217 00:08:52.728328 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:53 crc kubenswrapper[4791]: I0217 00:08:53.157232 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-r8zpf"] Feb 17 00:08:53 crc kubenswrapper[4791]: I0217 00:08:53.629402 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9dcp"] Feb 17 00:08:53 crc kubenswrapper[4791]: I0217 00:08:53.697819 4791 generic.go:334] "Generic (PLEG): container finished" podID="e0995bd5-bc3f-4011-b6ca-ee70c4d84798" containerID="3fa5b59208b65017a2be0da1d6bb8489fd1d5ca69ba90c98d7179b98a37399b3" exitCode=0 Feb 17 00:08:53 crc kubenswrapper[4791]: I0217 00:08:53.697927 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e0995bd5-bc3f-4011-b6ca-ee70c4d84798","Type":"ContainerDied","Data":"3fa5b59208b65017a2be0da1d6bb8489fd1d5ca69ba90c98d7179b98a37399b3"} Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.352043 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.352124 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.406094 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.703920 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b9dcp" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="registry-server" containerID="cri-o://5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e" gracePeriod=2 Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.748251 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.761258 
4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.761478 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.807513 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.956709 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.972780 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:08:54 crc kubenswrapper[4791]: I0217 00:08:54.972856 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.060694 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.133787 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kube-api-access\") pod \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.133884 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kubelet-dir\") pod \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\" (UID: \"e0995bd5-bc3f-4011-b6ca-ee70c4d84798\") " Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.134006 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e0995bd5-bc3f-4011-b6ca-ee70c4d84798" (UID: "e0995bd5-bc3f-4011-b6ca-ee70c4d84798"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.134171 4791 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.138396 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e0995bd5-bc3f-4011-b6ca-ee70c4d84798" (UID: "e0995bd5-bc3f-4011-b6ca-ee70c4d84798"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.237046 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-catalog-content\") pod \"9e98ac61-7140-4c15-8c29-47676734a52d\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.237214 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-utilities\") pod \"9e98ac61-7140-4c15-8c29-47676734a52d\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.237665 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6fd4\" (UniqueName: \"kubernetes.io/projected/9e98ac61-7140-4c15-8c29-47676734a52d-kube-api-access-f6fd4\") pod \"9e98ac61-7140-4c15-8c29-47676734a52d\" (UID: \"9e98ac61-7140-4c15-8c29-47676734a52d\") " Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.237847 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-utilities" (OuterVolumeSpecName: "utilities") pod "9e98ac61-7140-4c15-8c29-47676734a52d" (UID: "9e98ac61-7140-4c15-8c29-47676734a52d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.240770 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.240810 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0995bd5-bc3f-4011-b6ca-ee70c4d84798-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.256480 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e98ac61-7140-4c15-8c29-47676734a52d-kube-api-access-f6fd4" (OuterVolumeSpecName: "kube-api-access-f6fd4") pod "9e98ac61-7140-4c15-8c29-47676734a52d" (UID: "9e98ac61-7140-4c15-8c29-47676734a52d"). InnerVolumeSpecName "kube-api-access-f6fd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.322968 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e98ac61-7140-4c15-8c29-47676734a52d" (UID: "9e98ac61-7140-4c15-8c29-47676734a52d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.342188 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6fd4\" (UniqueName: \"kubernetes.io/projected/9e98ac61-7140-4c15-8c29-47676734a52d-kube-api-access-f6fd4\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.342216 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e98ac61-7140-4c15-8c29-47676734a52d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.713591 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e0995bd5-bc3f-4011-b6ca-ee70c4d84798","Type":"ContainerDied","Data":"97c89836b71ff7d21a5271a0fc522b64b958612bc0508d95b9f8ef20888e934d"} Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.713650 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97c89836b71ff7d21a5271a0fc522b64b958612bc0508d95b9f8ef20888e934d" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.713850 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.722337 4791 generic.go:334] "Generic (PLEG): container finished" podID="9e98ac61-7140-4c15-8c29-47676734a52d" containerID="5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e" exitCode=0 Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.723225 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9dcp" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.723253 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9dcp" event={"ID":"9e98ac61-7140-4c15-8c29-47676734a52d","Type":"ContainerDied","Data":"5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e"} Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.723310 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9dcp" event={"ID":"9e98ac61-7140-4c15-8c29-47676734a52d","Type":"ContainerDied","Data":"b50d297ceb2781d37cf93aa6962f7a2c6eaee4f855e23a5380418bef8dad3d16"} Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.723334 4791 scope.go:117] "RemoveContainer" containerID="5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.743024 4791 scope.go:117] "RemoveContainer" containerID="b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.765815 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.768462 4791 scope.go:117] "RemoveContainer" containerID="d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.768131 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9dcp"] Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.770865 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b9dcp"] Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.786126 4791 scope.go:117] "RemoveContainer" containerID="5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e" Feb 17 00:08:55 crc 
kubenswrapper[4791]: E0217 00:08:55.786436 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e\": container with ID starting with 5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e not found: ID does not exist" containerID="5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.786467 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e"} err="failed to get container status \"5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e\": rpc error: code = NotFound desc = could not find container \"5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e\": container with ID starting with 5f1f15c58396e8a0d372d8b6e3158f29e686530d20a25a9aa3576e031d69ab9e not found: ID does not exist" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.786502 4791 scope.go:117] "RemoveContainer" containerID="b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533" Feb 17 00:08:55 crc kubenswrapper[4791]: E0217 00:08:55.787646 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533\": container with ID starting with b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533 not found: ID does not exist" containerID="b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.787692 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533"} err="failed to get container status 
\"b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533\": rpc error: code = NotFound desc = could not find container \"b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533\": container with ID starting with b0d6d012f61c85279eddffc3b1f7adb9dc86b4b4ba7e62c78a6a02431b529533 not found: ID does not exist" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.787719 4791 scope.go:117] "RemoveContainer" containerID="d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7" Feb 17 00:08:55 crc kubenswrapper[4791]: E0217 00:08:55.788161 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7\": container with ID starting with d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7 not found: ID does not exist" containerID="d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7" Feb 17 00:08:55 crc kubenswrapper[4791]: I0217 00:08:55.788190 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7"} err="failed to get container status \"d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7\": rpc error: code = NotFound desc = could not find container \"d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7\": container with ID starting with d006ea87acb43994528a5f501fff92ed303a832231395907903d1e69a1ba44f7 not found: ID does not exist" Feb 17 00:08:56 crc kubenswrapper[4791]: I0217 00:08:56.632680 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bj8pc"] Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.228184 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" path="/var/lib/kubelet/pods/9e98ac61-7140-4c15-8c29-47676734a52d/volumes" Feb 17 00:08:57 
crc kubenswrapper[4791]: I0217 00:08:57.479471 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 00:08:57 crc kubenswrapper[4791]: E0217 00:08:57.479679 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="extract-content" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.479689 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="extract-content" Feb 17 00:08:57 crc kubenswrapper[4791]: E0217 00:08:57.479699 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="registry-server" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.479704 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="registry-server" Feb 17 00:08:57 crc kubenswrapper[4791]: E0217 00:08:57.479719 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="extract-utilities" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.479726 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="extract-utilities" Feb 17 00:08:57 crc kubenswrapper[4791]: E0217 00:08:57.479736 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0995bd5-bc3f-4011-b6ca-ee70c4d84798" containerName="pruner" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.479742 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0995bd5-bc3f-4011-b6ca-ee70c4d84798" containerName="pruner" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.479887 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e98ac61-7140-4c15-8c29-47676734a52d" containerName="registry-server" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.479902 4791 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e0995bd5-bc3f-4011-b6ca-ee70c4d84798" containerName="pruner" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.480263 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.482696 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.482967 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.488668 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.675448 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45d65fd1-6366-4ed0-bc40-10d5418435ea-kube-api-access\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.675489 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-var-lock\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.675513 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-kubelet-dir\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " 
pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.733439 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bj8pc" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="registry-server" containerID="cri-o://d649ea334bf454488556fe2c70b06ff12d86f88ff25ffa7b4f00f581346ea820" gracePeriod=2 Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.776336 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45d65fd1-6366-4ed0-bc40-10d5418435ea-kube-api-access\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.776392 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-var-lock\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.776411 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-kubelet-dir\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.776572 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-kubelet-dir\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.776573 4791 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-var-lock\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.806048 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45d65fd1-6366-4ed0-bc40-10d5418435ea-kube-api-access\") pod \"installer-9-crc\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:57 crc kubenswrapper[4791]: I0217 00:08:57.809248 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:08:58 crc kubenswrapper[4791]: I0217 00:08:58.740459 4791 generic.go:334] "Generic (PLEG): container finished" podID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerID="d649ea334bf454488556fe2c70b06ff12d86f88ff25ffa7b4f00f581346ea820" exitCode=0 Feb 17 00:08:58 crc kubenswrapper[4791]: I0217 00:08:58.740526 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerDied","Data":"d649ea334bf454488556fe2c70b06ff12d86f88ff25ffa7b4f00f581346ea820"} Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.281948 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.299515 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 00:08:59 crc kubenswrapper[4791]: W0217 00:08:59.310538 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod45d65fd1_6366_4ed0_bc40_10d5418435ea.slice/crio-e34b1921e9f7cc6a8f4baa20b25186c7eb9d2fa742e706dd1bd3c235046f30fd WatchSource:0}: Error finding container e34b1921e9f7cc6a8f4baa20b25186c7eb9d2fa742e706dd1bd3c235046f30fd: Status 404 returned error can't find the container with id e34b1921e9f7cc6a8f4baa20b25186c7eb9d2fa742e706dd1bd3c235046f30fd Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.411406 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-catalog-content\") pod \"fc437e64-8eee-418b-83d2-f79578cec0fe\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.411759 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngk4h\" (UniqueName: \"kubernetes.io/projected/fc437e64-8eee-418b-83d2-f79578cec0fe-kube-api-access-ngk4h\") pod \"fc437e64-8eee-418b-83d2-f79578cec0fe\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.411811 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-utilities\") pod \"fc437e64-8eee-418b-83d2-f79578cec0fe\" (UID: \"fc437e64-8eee-418b-83d2-f79578cec0fe\") " Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.412636 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-utilities" (OuterVolumeSpecName: "utilities") pod "fc437e64-8eee-418b-83d2-f79578cec0fe" (UID: "fc437e64-8eee-418b-83d2-f79578cec0fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.416900 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc437e64-8eee-418b-83d2-f79578cec0fe-kube-api-access-ngk4h" (OuterVolumeSpecName: "kube-api-access-ngk4h") pod "fc437e64-8eee-418b-83d2-f79578cec0fe" (UID: "fc437e64-8eee-418b-83d2-f79578cec0fe"). InnerVolumeSpecName "kube-api-access-ngk4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.513948 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngk4h\" (UniqueName: \"kubernetes.io/projected/fc437e64-8eee-418b-83d2-f79578cec0fe-kube-api-access-ngk4h\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.513978 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.533647 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc437e64-8eee-418b-83d2-f79578cec0fe" (UID: "fc437e64-8eee-418b-83d2-f79578cec0fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.615003 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc437e64-8eee-418b-83d2-f79578cec0fe-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.749182 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj8pc" event={"ID":"fc437e64-8eee-418b-83d2-f79578cec0fe","Type":"ContainerDied","Data":"66dcd55652a7b7d244cc261ad14bf94f214d23f47f292c66da7aede063b2653b"} Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.749204 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bj8pc" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.749246 4791 scope.go:117] "RemoveContainer" containerID="d649ea334bf454488556fe2c70b06ff12d86f88ff25ffa7b4f00f581346ea820" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.754649 4791 generic.go:334] "Generic (PLEG): container finished" podID="48855520-658c-4579-a867-7e984bce56c7" containerID="0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61" exitCode=0 Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.754675 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgmd4" event={"ID":"48855520-658c-4579-a867-7e984bce56c7","Type":"ContainerDied","Data":"0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61"} Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.757299 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"45d65fd1-6366-4ed0-bc40-10d5418435ea","Type":"ContainerStarted","Data":"625e883c4175a7bb832ef122a4de927b44b76c4d7352658a36f562332f5bdbaa"} Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.757354 4791 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"45d65fd1-6366-4ed0-bc40-10d5418435ea","Type":"ContainerStarted","Data":"e34b1921e9f7cc6a8f4baa20b25186c7eb9d2fa742e706dd1bd3c235046f30fd"} Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.768276 4791 scope.go:117] "RemoveContainer" containerID="08a68b5f0e7fdf02d507ba4fd7e4b03827c1f9b663dd4ab621eef2f40ab4ac2d" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.839778 4791 scope.go:117] "RemoveContainer" containerID="4c3e1e8da3848cbfda653b16270111f1411ce1e227ddf3a4156d7065874d5fdb" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.849748 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.8495815220000003 podStartE2EDuration="2.849581522s" podCreationTimestamp="2026-02-17 00:08:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:08:59.846327387 +0000 UTC m=+197.325839924" watchObservedRunningTime="2026-02-17 00:08:59.849581522 +0000 UTC m=+197.329094049" Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.866009 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bj8pc"] Feb 17 00:08:59 crc kubenswrapper[4791]: I0217 00:08:59.870276 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bj8pc"] Feb 17 00:09:00 crc kubenswrapper[4791]: I0217 00:09:00.764640 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgmd4" event={"ID":"48855520-658c-4579-a867-7e984bce56c7","Type":"ContainerStarted","Data":"5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01"} Feb 17 00:09:00 crc kubenswrapper[4791]: I0217 00:09:00.767305 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-tt7zf" event={"ID":"9895217e-9934-4f80-a583-98842d597690","Type":"ContainerStarted","Data":"50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a"} Feb 17 00:09:00 crc kubenswrapper[4791]: I0217 00:09:00.781719 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cgmd4" podStartSLOduration=3.73993951 podStartE2EDuration="50.781701661s" podCreationTimestamp="2026-02-17 00:08:10 +0000 UTC" firstStartedPulling="2026-02-17 00:08:13.212922256 +0000 UTC m=+150.692434793" lastFinishedPulling="2026-02-17 00:09:00.254684417 +0000 UTC m=+197.734196944" observedRunningTime="2026-02-17 00:09:00.780660499 +0000 UTC m=+198.260173026" watchObservedRunningTime="2026-02-17 00:09:00.781701661 +0000 UTC m=+198.261214188" Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.209495 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.209551 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.226209 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" path="/var/lib/kubelet/pods/fc437e64-8eee-418b-83d2-f79578cec0fe/volumes" Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.773460 4791 generic.go:334] "Generic (PLEG): container finished" podID="9895217e-9934-4f80-a583-98842d597690" containerID="50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a" exitCode=0 Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.773527 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tt7zf" 
event={"ID":"9895217e-9934-4f80-a583-98842d597690","Type":"ContainerDied","Data":"50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a"} Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.775694 4791 generic.go:334] "Generic (PLEG): container finished" podID="509fdefa-10b0-4752-b844-843ed9e7106d" containerID="5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7" exitCode=0 Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.775743 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twq6q" event={"ID":"509fdefa-10b0-4752-b844-843ed9e7106d","Type":"ContainerDied","Data":"5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7"} Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.780280 4791 generic.go:334] "Generic (PLEG): container finished" podID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerID="517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943" exitCode=0 Feb 17 00:09:01 crc kubenswrapper[4791]: I0217 00:09:01.780353 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h66xr" event={"ID":"db1caaaf-7e8b-405c-97ff-7c507f068688","Type":"ContainerDied","Data":"517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943"} Feb 17 00:09:02 crc kubenswrapper[4791]: I0217 00:09:02.256232 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-cgmd4" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="registry-server" probeResult="failure" output=< Feb 17 00:09:02 crc kubenswrapper[4791]: timeout: failed to connect service ":50051" within 1s Feb 17 00:09:02 crc kubenswrapper[4791]: > Feb 17 00:09:02 crc kubenswrapper[4791]: I0217 00:09:02.787973 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tt7zf" 
event={"ID":"9895217e-9934-4f80-a583-98842d597690","Type":"ContainerStarted","Data":"689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034"} Feb 17 00:09:02 crc kubenswrapper[4791]: I0217 00:09:02.791028 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twq6q" event={"ID":"509fdefa-10b0-4752-b844-843ed9e7106d","Type":"ContainerStarted","Data":"ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3"} Feb 17 00:09:02 crc kubenswrapper[4791]: I0217 00:09:02.793529 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h66xr" event={"ID":"db1caaaf-7e8b-405c-97ff-7c507f068688","Type":"ContainerStarted","Data":"5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a"} Feb 17 00:09:02 crc kubenswrapper[4791]: I0217 00:09:02.810453 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tt7zf" podStartSLOduration=2.77939956 podStartE2EDuration="51.810442471s" podCreationTimestamp="2026-02-17 00:08:11 +0000 UTC" firstStartedPulling="2026-02-17 00:08:13.214274039 +0000 UTC m=+150.693786566" lastFinishedPulling="2026-02-17 00:09:02.24531695 +0000 UTC m=+199.724829477" observedRunningTime="2026-02-17 00:09:02.808714935 +0000 UTC m=+200.288227462" watchObservedRunningTime="2026-02-17 00:09:02.810442471 +0000 UTC m=+200.289954998" Feb 17 00:09:02 crc kubenswrapper[4791]: I0217 00:09:02.822548 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-twq6q" podStartSLOduration=2.631955784 podStartE2EDuration="49.822529159s" podCreationTimestamp="2026-02-17 00:08:13 +0000 UTC" firstStartedPulling="2026-02-17 00:08:15.313141033 +0000 UTC m=+152.792653560" lastFinishedPulling="2026-02-17 00:09:02.503714408 +0000 UTC m=+199.983226935" observedRunningTime="2026-02-17 00:09:02.821226467 +0000 UTC m=+200.300739004" 
watchObservedRunningTime="2026-02-17 00:09:02.822529159 +0000 UTC m=+200.302041686" Feb 17 00:09:02 crc kubenswrapper[4791]: I0217 00:09:02.844496 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h66xr" podStartSLOduration=1.6676257159999999 podStartE2EDuration="49.844478044s" podCreationTimestamp="2026-02-17 00:08:13 +0000 UTC" firstStartedPulling="2026-02-17 00:08:14.247102876 +0000 UTC m=+151.726615393" lastFinishedPulling="2026-02-17 00:09:02.423955194 +0000 UTC m=+199.903467721" observedRunningTime="2026-02-17 00:09:02.841302353 +0000 UTC m=+200.320814880" watchObservedRunningTime="2026-02-17 00:09:02.844478044 +0000 UTC m=+200.323990581" Feb 17 00:09:03 crc kubenswrapper[4791]: I0217 00:09:03.355858 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:09:03 crc kubenswrapper[4791]: I0217 00:09:03.355909 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:09:03 crc kubenswrapper[4791]: I0217 00:09:03.847956 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:09:03 crc kubenswrapper[4791]: I0217 00:09:03.848034 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:09:04 crc kubenswrapper[4791]: I0217 00:09:04.399785 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-h66xr" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="registry-server" probeResult="failure" output=< Feb 17 00:09:04 crc kubenswrapper[4791]: timeout: failed to connect service ":50051" within 1s Feb 17 00:09:04 crc kubenswrapper[4791]: > Feb 17 00:09:04 crc kubenswrapper[4791]: I0217 00:09:04.893156 4791 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-marketplace-twq6q" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="registry-server" probeResult="failure" output=< Feb 17 00:09:04 crc kubenswrapper[4791]: timeout: failed to connect service ":50051" within 1s Feb 17 00:09:04 crc kubenswrapper[4791]: > Feb 17 00:09:11 crc kubenswrapper[4791]: I0217 00:09:11.258329 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:09:11 crc kubenswrapper[4791]: I0217 00:09:11.307618 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:09:11 crc kubenswrapper[4791]: I0217 00:09:11.554170 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:09:11 crc kubenswrapper[4791]: I0217 00:09:11.554218 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:09:11 crc kubenswrapper[4791]: I0217 00:09:11.601495 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:09:11 crc kubenswrapper[4791]: I0217 00:09:11.916045 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:09:13 crc kubenswrapper[4791]: I0217 00:09:13.406156 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:09:13 crc kubenswrapper[4791]: I0217 00:09:13.452310 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:09:13 crc kubenswrapper[4791]: I0217 00:09:13.459383 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-tt7zf"] Feb 17 00:09:13 crc kubenswrapper[4791]: I0217 00:09:13.862859 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tt7zf" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="registry-server" containerID="cri-o://689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034" gracePeriod=2 Feb 17 00:09:13 crc kubenswrapper[4791]: I0217 00:09:13.889819 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:09:13 crc kubenswrapper[4791]: I0217 00:09:13.925375 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.231556 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.345049 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-utilities\") pod \"9895217e-9934-4f80-a583-98842d597690\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.345093 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndp5p\" (UniqueName: \"kubernetes.io/projected/9895217e-9934-4f80-a583-98842d597690-kube-api-access-ndp5p\") pod \"9895217e-9934-4f80-a583-98842d597690\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.345179 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-catalog-content\") pod 
\"9895217e-9934-4f80-a583-98842d597690\" (UID: \"9895217e-9934-4f80-a583-98842d597690\") " Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.347480 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-utilities" (OuterVolumeSpecName: "utilities") pod "9895217e-9934-4f80-a583-98842d597690" (UID: "9895217e-9934-4f80-a583-98842d597690"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.354634 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9895217e-9934-4f80-a583-98842d597690-kube-api-access-ndp5p" (OuterVolumeSpecName: "kube-api-access-ndp5p") pod "9895217e-9934-4f80-a583-98842d597690" (UID: "9895217e-9934-4f80-a583-98842d597690"). InnerVolumeSpecName "kube-api-access-ndp5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.413857 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9895217e-9934-4f80-a583-98842d597690" (UID: "9895217e-9934-4f80-a583-98842d597690"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.446935 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.446972 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9895217e-9934-4f80-a583-98842d597690-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.446983 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndp5p\" (UniqueName: \"kubernetes.io/projected/9895217e-9934-4f80-a583-98842d597690-kube-api-access-ndp5p\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.873233 4791 generic.go:334] "Generic (PLEG): container finished" podID="9895217e-9934-4f80-a583-98842d597690" containerID="689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034" exitCode=0 Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.873371 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tt7zf" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.873390 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tt7zf" event={"ID":"9895217e-9934-4f80-a583-98842d597690","Type":"ContainerDied","Data":"689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034"} Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.873868 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tt7zf" event={"ID":"9895217e-9934-4f80-a583-98842d597690","Type":"ContainerDied","Data":"977ab4aa20ed7b60a4814c6d6489439a907a290d464af852a9ad4af4039d6953"} Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.873899 4791 scope.go:117] "RemoveContainer" containerID="689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.897390 4791 scope.go:117] "RemoveContainer" containerID="50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.911523 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tt7zf"] Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.918062 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tt7zf"] Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.936244 4791 scope.go:117] "RemoveContainer" containerID="b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.986150 4791 scope.go:117] "RemoveContainer" containerID="689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034" Feb 17 00:09:14 crc kubenswrapper[4791]: E0217 00:09:14.986644 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034\": container with ID starting with 689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034 not found: ID does not exist" containerID="689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.986682 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034"} err="failed to get container status \"689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034\": rpc error: code = NotFound desc = could not find container \"689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034\": container with ID starting with 689a72338c06b9eae09991f0631b50e3df437c11c9c31b6df3759a41093c5034 not found: ID does not exist" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.986708 4791 scope.go:117] "RemoveContainer" containerID="50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a" Feb 17 00:09:14 crc kubenswrapper[4791]: E0217 00:09:14.987168 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a\": container with ID starting with 50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a not found: ID does not exist" containerID="50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.987221 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a"} err="failed to get container status \"50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a\": rpc error: code = NotFound desc = could not find container \"50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a\": container with ID 
starting with 50fcf4873c5a6dd6daf12f726d1b182ebcd3d6006fd89bb5ee76ec514fb8ad9a not found: ID does not exist" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.987260 4791 scope.go:117] "RemoveContainer" containerID="b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33" Feb 17 00:09:14 crc kubenswrapper[4791]: E0217 00:09:14.987655 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33\": container with ID starting with b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33 not found: ID does not exist" containerID="b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33" Feb 17 00:09:14 crc kubenswrapper[4791]: I0217 00:09:14.987721 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33"} err="failed to get container status \"b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33\": rpc error: code = NotFound desc = could not find container \"b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33\": container with ID starting with b36899b8f5eaec69b28a571b963ca26fa0d5f68899482c023912c3245da10e33 not found: ID does not exist" Feb 17 00:09:15 crc kubenswrapper[4791]: I0217 00:09:15.227507 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9895217e-9934-4f80-a583-98842d597690" path="/var/lib/kubelet/pods/9895217e-9934-4f80-a583-98842d597690/volumes" Feb 17 00:09:15 crc kubenswrapper[4791]: I0217 00:09:15.663350 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-twq6q"] Feb 17 00:09:15 crc kubenswrapper[4791]: I0217 00:09:15.882688 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-twq6q" 
podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="registry-server" containerID="cri-o://ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3" gracePeriod=2 Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.384550 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.472391 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-utilities\") pod \"509fdefa-10b0-4752-b844-843ed9e7106d\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.472519 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-catalog-content\") pod \"509fdefa-10b0-4752-b844-843ed9e7106d\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.472572 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hwtv\" (UniqueName: \"kubernetes.io/projected/509fdefa-10b0-4752-b844-843ed9e7106d-kube-api-access-8hwtv\") pod \"509fdefa-10b0-4752-b844-843ed9e7106d\" (UID: \"509fdefa-10b0-4752-b844-843ed9e7106d\") " Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.473500 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-utilities" (OuterVolumeSpecName: "utilities") pod "509fdefa-10b0-4752-b844-843ed9e7106d" (UID: "509fdefa-10b0-4752-b844-843ed9e7106d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.479411 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/509fdefa-10b0-4752-b844-843ed9e7106d-kube-api-access-8hwtv" (OuterVolumeSpecName: "kube-api-access-8hwtv") pod "509fdefa-10b0-4752-b844-843ed9e7106d" (UID: "509fdefa-10b0-4752-b844-843ed9e7106d"). InnerVolumeSpecName "kube-api-access-8hwtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.523313 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "509fdefa-10b0-4752-b844-843ed9e7106d" (UID: "509fdefa-10b0-4752-b844-843ed9e7106d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.574315 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.574374 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hwtv\" (UniqueName: \"kubernetes.io/projected/509fdefa-10b0-4752-b844-843ed9e7106d-kube-api-access-8hwtv\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.574394 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/509fdefa-10b0-4752-b844-843ed9e7106d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.892062 4791 generic.go:334] "Generic (PLEG): container finished" podID="509fdefa-10b0-4752-b844-843ed9e7106d" 
containerID="ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3" exitCode=0 Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.892143 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twq6q" event={"ID":"509fdefa-10b0-4752-b844-843ed9e7106d","Type":"ContainerDied","Data":"ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3"} Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.892185 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twq6q" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.892207 4791 scope.go:117] "RemoveContainer" containerID="ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.892191 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twq6q" event={"ID":"509fdefa-10b0-4752-b844-843ed9e7106d","Type":"ContainerDied","Data":"10bc712141e232e5d7064d897763d2f2a15a17f24973eb264bc82a8e5f9eb39a"} Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.919695 4791 scope.go:117] "RemoveContainer" containerID="5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.948210 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-twq6q"] Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.953289 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-twq6q"] Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.964016 4791 scope.go:117] "RemoveContainer" containerID="df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.982869 4791 scope.go:117] "RemoveContainer" containerID="ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3" Feb 17 
00:09:16 crc kubenswrapper[4791]: E0217 00:09:16.983577 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3\": container with ID starting with ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3 not found: ID does not exist" containerID="ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.983783 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3"} err="failed to get container status \"ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3\": rpc error: code = NotFound desc = could not find container \"ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3\": container with ID starting with ea65bc2d61ac85235a80ae8a27573effd372e25997a3f5303b609799ecc6b2f3 not found: ID does not exist" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.983925 4791 scope.go:117] "RemoveContainer" containerID="5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7" Feb 17 00:09:16 crc kubenswrapper[4791]: E0217 00:09:16.984552 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7\": container with ID starting with 5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7 not found: ID does not exist" containerID="5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.984617 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7"} err="failed to get container status 
\"5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7\": rpc error: code = NotFound desc = could not find container \"5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7\": container with ID starting with 5706adc95f93987fa96bb222e121f9a9ec4e44f5803df7b257f2595f2a56b0d7 not found: ID does not exist" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.984659 4791 scope.go:117] "RemoveContainer" containerID="df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98" Feb 17 00:09:16 crc kubenswrapper[4791]: E0217 00:09:16.985122 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98\": container with ID starting with df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98 not found: ID does not exist" containerID="df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98" Feb 17 00:09:16 crc kubenswrapper[4791]: I0217 00:09:16.985276 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98"} err="failed to get container status \"df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98\": rpc error: code = NotFound desc = could not find container \"df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98\": container with ID starting with df626c649c84f3f24620a23cf5b0e1d66e1248e9b1655b34e6e8054df1deef98 not found: ID does not exist" Feb 17 00:09:17 crc kubenswrapper[4791]: I0217 00:09:17.231287 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" path="/var/lib/kubelet/pods/509fdefa-10b0-4752-b844-843ed9e7106d/volumes" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.185930 4791 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" podUID="c70fe9d3-348d-4bb8-89f7-21027041131a" containerName="oauth-openshift" containerID="cri-o://8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7" gracePeriod=15 Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.523752 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705090 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws66n\" (UniqueName: \"kubernetes.io/projected/c70fe9d3-348d-4bb8-89f7-21027041131a-kube-api-access-ws66n\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705149 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-trusted-ca-bundle\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705194 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-session\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705209 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-dir\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705231 
4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-ocp-branding-template\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705273 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-service-ca\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705294 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-policies\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705309 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-login\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705333 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-cliconfig\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705354 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-serving-cert\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705372 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-error\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705408 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-provider-selection\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705435 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-router-certs\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705455 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-idp-0-file-data\") pod \"c70fe9d3-348d-4bb8-89f7-21027041131a\" (UID: \"c70fe9d3-348d-4bb8-89f7-21027041131a\") " Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.705926 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.706295 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.709638 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.710021 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.710078 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.711806 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.712079 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.712271 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.712439 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.712629 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.717427 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70fe9d3-348d-4bb8-89f7-21027041131a-kube-api-access-ws66n" (OuterVolumeSpecName: "kube-api-access-ws66n") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "kube-api-access-ws66n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.719861 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.720321 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.728673 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c70fe9d3-348d-4bb8-89f7-21027041131a" (UID: "c70fe9d3-348d-4bb8-89f7-21027041131a"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806634 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806668 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806679 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 
00:09:18.806689 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws66n\" (UniqueName: \"kubernetes.io/projected/c70fe9d3-348d-4bb8-89f7-21027041131a-kube-api-access-ws66n\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806699 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806707 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806716 4791 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806726 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806736 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806745 4791 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc 
kubenswrapper[4791]: I0217 00:09:18.806754 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806763 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806771 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.806780 4791 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c70fe9d3-348d-4bb8-89f7-21027041131a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.909182 4791 generic.go:334] "Generic (PLEG): container finished" podID="c70fe9d3-348d-4bb8-89f7-21027041131a" containerID="8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7" exitCode=0 Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.909361 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.909396 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" event={"ID":"c70fe9d3-348d-4bb8-89f7-21027041131a","Type":"ContainerDied","Data":"8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7"} Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.911200 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r8zpf" event={"ID":"c70fe9d3-348d-4bb8-89f7-21027041131a","Type":"ContainerDied","Data":"bc92c2848641b389e30636fc84dbaa434604ffad03bf817e464c4e944f11c4ea"} Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.911251 4791 scope.go:117] "RemoveContainer" containerID="8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.943247 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8zpf"] Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.946412 4791 scope.go:117] "RemoveContainer" containerID="8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7" Feb 17 00:09:18 crc kubenswrapper[4791]: E0217 00:09:18.947143 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7\": container with ID starting with 8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7 not found: ID does not exist" containerID="8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.947227 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7"} err="failed to 
get container status \"8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7\": rpc error: code = NotFound desc = could not find container \"8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7\": container with ID starting with 8f95a285c9f42fc1af78c5ca2e9c6d3afa217abceec616519705df06ca48c8b7 not found: ID does not exist" Feb 17 00:09:18 crc kubenswrapper[4791]: I0217 00:09:18.949301 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8zpf"] Feb 17 00:09:19 crc kubenswrapper[4791]: I0217 00:09:19.228918 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70fe9d3-348d-4bb8-89f7-21027041131a" path="/var/lib/kubelet/pods/c70fe9d3-348d-4bb8-89f7-21027041131a/volumes" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.607796 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-79d78bfd77-v4r9g"] Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.608948 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70fe9d3-348d-4bb8-89f7-21027041131a" containerName="oauth-openshift" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.609052 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70fe9d3-348d-4bb8-89f7-21027041131a" containerName="oauth-openshift" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.609159 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.609247 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.609341 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: 
I0217 00:09:24.609431 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.609530 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="extract-content" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.609607 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="extract-content" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.609689 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="extract-utilities" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.609767 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="extract-utilities" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.609843 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="extract-content" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.609914 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="extract-content" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.609992 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="extract-content" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.610072 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="extract-content" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.610175 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="extract-utilities" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 
00:09:24.610273 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="extract-utilities" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.610351 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.610420 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: E0217 00:09:24.610497 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="extract-utilities" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.610577 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="extract-utilities" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.610778 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="9895217e-9934-4f80-a583-98842d597690" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.610861 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70fe9d3-348d-4bb8-89f7-21027041131a" containerName="oauth-openshift" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.610923 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc437e64-8eee-418b-83d2-f79578cec0fe" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.610984 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="509fdefa-10b0-4752-b844-843ed9e7106d" containerName="registry-server" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.611496 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.615796 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.615810 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.616035 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.616341 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.616371 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.616732 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.617514 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.617667 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.618552 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.618683 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 00:09:24 crc 
kubenswrapper[4791]: I0217 00:09:24.618787 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.619271 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79d78bfd77-v4r9g"] Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.622701 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.627354 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.628757 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.633193 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.798809 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.798870 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: 
\"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.798907 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.798932 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1917d075-c5be-48b4-baa3-a25bc8a6655b-audit-dir\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.798951 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-login\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.798976 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799102 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-session\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799193 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k58nk\" (UniqueName: \"kubernetes.io/projected/1917d075-c5be-48b4-baa3-a25bc8a6655b-kube-api-access-k58nk\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799230 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799372 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799491 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799523 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799562 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-error\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.799594 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-audit-policies\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900004 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: 
\"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900051 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900075 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-error\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900095 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-audit-policies\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900132 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900147 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900171 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900195 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1917d075-c5be-48b4-baa3-a25bc8a6655b-audit-dir\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900221 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-login\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900249 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: 
\"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900271 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-session\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900291 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k58nk\" (UniqueName: \"kubernetes.io/projected/1917d075-c5be-48b4-baa3-a25bc8a6655b-kube-api-access-k58nk\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900315 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.900347 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.901022 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1917d075-c5be-48b4-baa3-a25bc8a6655b-audit-dir\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.901294 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.901562 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.901717 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-audit-policies\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.901817 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc 
kubenswrapper[4791]: I0217 00:09:24.905883 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.906010 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.907310 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.907509 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-login\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.908451 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.908975 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-session\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.912851 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-user-template-error\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.918514 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k58nk\" (UniqueName: \"kubernetes.io/projected/1917d075-c5be-48b4-baa3-a25bc8a6655b-kube-api-access-k58nk\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.919594 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1917d075-c5be-48b4-baa3-a25bc8a6655b-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d78bfd77-v4r9g\" (UID: \"1917d075-c5be-48b4-baa3-a25bc8a6655b\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:24 crc 
kubenswrapper[4791]: I0217 00:09:24.973381 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.973453 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.973506 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.974051 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.974186 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28" gracePeriod=600 Feb 17 00:09:24 crc kubenswrapper[4791]: I0217 00:09:24.979731 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.166690 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79d78bfd77-v4r9g"] Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.954784 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" event={"ID":"1917d075-c5be-48b4-baa3-a25bc8a6655b","Type":"ContainerStarted","Data":"ee5a27a28e2f585a97fd3e167f824e5269f315e80111e803154c9780bee80889"} Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.955448 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.955469 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" event={"ID":"1917d075-c5be-48b4-baa3-a25bc8a6655b","Type":"ContainerStarted","Data":"60315c44462d35ce29a4625b1eb5cd20cf0f03f317abd9826169edbb52be1877"} Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.957323 4791 generic.go:334] "Generic (PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28" exitCode=0 Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.957423 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28"} Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.957613 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" 
event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"9f0176ad0d56ad2a09e7895c1ba79efcf2efe001559c2c246d6cab82cbcdac2f"} Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.963132 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" Feb 17 00:09:25 crc kubenswrapper[4791]: I0217 00:09:25.982123 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-79d78bfd77-v4r9g" podStartSLOduration=32.982085798 podStartE2EDuration="32.982085798s" podCreationTimestamp="2026-02-17 00:08:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:09:25.978611136 +0000 UTC m=+223.458123673" watchObservedRunningTime="2026-02-17 00:09:25.982085798 +0000 UTC m=+223.461598325" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.423229 4791 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.424484 4791 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.424722 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118" gracePeriod=15 Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.424774 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.424834 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0" gracePeriod=15 Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.424863 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef" gracePeriod=15 Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.424882 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c" gracePeriod=15 Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.424798 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373" gracePeriod=15 Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.426529 4791 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 00:09:37 crc kubenswrapper[4791]: E0217 00:09:37.427011 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427035 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 00:09:37 crc kubenswrapper[4791]: E0217 00:09:37.427049 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427151 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 00:09:37 crc kubenswrapper[4791]: E0217 00:09:37.427165 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427172 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 00:09:37 crc kubenswrapper[4791]: E0217 00:09:37.427180 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427187 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 00:09:37 crc kubenswrapper[4791]: E0217 00:09:37.427195 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427222 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 00:09:37 crc kubenswrapper[4791]: E0217 00:09:37.427232 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427238 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 00:09:37 crc kubenswrapper[4791]: E0217 00:09:37.427245 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427253 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427389 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427400 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427408 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427419 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427428 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.427437 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 00:09:37 crc 
kubenswrapper[4791]: I0217 00:09:37.585967 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.586324 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.586400 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.586502 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.586553 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.586594 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.586616 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.586708 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688308 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688371 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688398 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688426 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688426 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688481 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688492 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688451 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688549 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688572 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688505 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688584 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688637 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688612 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688672 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:37 crc kubenswrapper[4791]: I0217 00:09:37.688634 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.030776 4791 generic.go:334] "Generic (PLEG): container finished" podID="45d65fd1-6366-4ed0-bc40-10d5418435ea" containerID="625e883c4175a7bb832ef122a4de927b44b76c4d7352658a36f562332f5bdbaa" exitCode=0 Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.030891 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"45d65fd1-6366-4ed0-bc40-10d5418435ea","Type":"ContainerDied","Data":"625e883c4175a7bb832ef122a4de927b44b76c4d7352658a36f562332f5bdbaa"} Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.032461 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.032924 4791 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.033749 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.034851 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.035740 4791 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef" exitCode=0 Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.035869 4791 scope.go:117] "RemoveContainer" containerID="f23dc23f74e7a7df93ee8e84bbdc66a3ce1b43845c5a9493f135a32d58ab419c" Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.035927 4791 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373" exitCode=0 Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.036065 4791 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0" exitCode=0 Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.036090 4791 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c" exitCode=2 Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.122971 4791 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Feb 17 00:09:38 crc kubenswrapper[4791]: I0217 00:09:38.123059 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Feb 17 00:09:39 crc kubenswrapper[4791]: I0217 00:09:39.842629 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.030372 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.031243 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.032790 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.033602 4791 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.133834 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.134637 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.135128 4791 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221404 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221508 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221512 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221551 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221608 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221679 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221873 4791 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221893 4791 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.221902 4791 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.322689 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-var-lock\") pod \"45d65fd1-6366-4ed0-bc40-10d5418435ea\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.322750 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-kubelet-dir\") pod \"45d65fd1-6366-4ed0-bc40-10d5418435ea\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.322793 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45d65fd1-6366-4ed0-bc40-10d5418435ea-kube-api-access\") pod \"45d65fd1-6366-4ed0-bc40-10d5418435ea\" (UID: \"45d65fd1-6366-4ed0-bc40-10d5418435ea\") " Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.322807 4791 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-var-lock" (OuterVolumeSpecName: "var-lock") pod "45d65fd1-6366-4ed0-bc40-10d5418435ea" (UID: "45d65fd1-6366-4ed0-bc40-10d5418435ea"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.322872 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "45d65fd1-6366-4ed0-bc40-10d5418435ea" (UID: "45d65fd1-6366-4ed0-bc40-10d5418435ea"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.323163 4791 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.323184 4791 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45d65fd1-6366-4ed0-bc40-10d5418435ea-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.327667 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d65fd1-6366-4ed0-bc40-10d5418435ea-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "45d65fd1-6366-4ed0-bc40-10d5418435ea" (UID: "45d65fd1-6366-4ed0-bc40-10d5418435ea"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.424081 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45d65fd1-6366-4ed0-bc40-10d5418435ea-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.853436 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.854443 4791 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118" exitCode=0 Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.854544 4791 scope.go:117] "RemoveContainer" containerID="4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.854555 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.857170 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.857317 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"45d65fd1-6366-4ed0-bc40-10d5418435ea","Type":"ContainerDied","Data":"e34b1921e9f7cc6a8f4baa20b25186c7eb9d2fa742e706dd1bd3c235046f30fd"} Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.857466 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e34b1921e9f7cc6a8f4baa20b25186c7eb9d2fa742e706dd1bd3c235046f30fd" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.872981 4791 scope.go:117] "RemoveContainer" containerID="2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.876285 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.876814 4791 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.884535 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.885942 
4791 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.892873 4791 scope.go:117] "RemoveContainer" containerID="878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.907502 4791 scope.go:117] "RemoveContainer" containerID="ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.919216 4791 scope.go:117] "RemoveContainer" containerID="8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.936410 4791 scope.go:117] "RemoveContainer" containerID="8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.976064 4791 scope.go:117] "RemoveContainer" containerID="4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef" Feb 17 00:09:40 crc kubenswrapper[4791]: E0217 00:09:40.976512 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\": container with ID starting with 4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef not found: ID does not exist" containerID="4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.976858 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef"} err="failed to get container status 
\"4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\": rpc error: code = NotFound desc = could not find container \"4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef\": container with ID starting with 4347fc44d6521835920cd73854a1e75b3a02e3a81a6d89495ea8fa934f8a6fef not found: ID does not exist" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.976949 4791 scope.go:117] "RemoveContainer" containerID="2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373" Feb 17 00:09:40 crc kubenswrapper[4791]: E0217 00:09:40.977202 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\": container with ID starting with 2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373 not found: ID does not exist" containerID="2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.977275 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373"} err="failed to get container status \"2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\": rpc error: code = NotFound desc = could not find container \"2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373\": container with ID starting with 2cd7501bca9c46992c1220d19c271440098ae338e9719d277589600e82470373 not found: ID does not exist" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.977351 4791 scope.go:117] "RemoveContainer" containerID="878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0" Feb 17 00:09:40 crc kubenswrapper[4791]: E0217 00:09:40.977571 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\": container with ID starting with 878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0 not found: ID does not exist" containerID="878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.977652 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0"} err="failed to get container status \"878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\": rpc error: code = NotFound desc = could not find container \"878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0\": container with ID starting with 878645a3c2e900772265aa36f5dd2d275d0227f0de48c36d01038792e7d50bd0 not found: ID does not exist" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.977775 4791 scope.go:117] "RemoveContainer" containerID="ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c" Feb 17 00:09:40 crc kubenswrapper[4791]: E0217 00:09:40.978081 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\": container with ID starting with ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c not found: ID does not exist" containerID="ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.978161 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c"} err="failed to get container status \"ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\": rpc error: code = NotFound desc = could not find container \"ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c\": container with ID 
starting with ddc3064ca0d7c85450217513ae26b9629e9ebc2f86a567dd2862659f11d9820c not found: ID does not exist" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.978233 4791 scope.go:117] "RemoveContainer" containerID="8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118" Feb 17 00:09:40 crc kubenswrapper[4791]: E0217 00:09:40.978950 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\": container with ID starting with 8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118 not found: ID does not exist" containerID="8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.979027 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118"} err="failed to get container status \"8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\": rpc error: code = NotFound desc = could not find container \"8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118\": container with ID starting with 8f5185fe715d5f17ae0fc161b970d4e99ce70f5f506f36a9c4700691f12f4118 not found: ID does not exist" Feb 17 00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.979099 4791 scope.go:117] "RemoveContainer" containerID="8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf" Feb 17 00:09:40 crc kubenswrapper[4791]: E0217 00:09:40.979727 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\": container with ID starting with 8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf not found: ID does not exist" containerID="8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf" Feb 17 
00:09:40 crc kubenswrapper[4791]: I0217 00:09:40.979844 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf"} err="failed to get container status \"8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\": rpc error: code = NotFound desc = could not find container \"8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf\": container with ID starting with 8eba69183344482a96c1e63e2453aa5585287d7035b1bb71cc8cc61ac8d62cdf not found: ID does not exist" Feb 17 00:09:41 crc kubenswrapper[4791]: I0217 00:09:41.235899 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.331167 4791 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.331907 4791 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.332269 4791 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.332603 4791 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection 
refused" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.333044 4791 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:42 crc kubenswrapper[4791]: I0217 00:09:42.333078 4791 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.333345 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="200ms" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.455404 4791 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:42 crc kubenswrapper[4791]: I0217 00:09:42.456049 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:42 crc kubenswrapper[4791]: W0217 00:09:42.486745 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-4fe5cf9852d4117d643ccc9b78a202850726325147caa3d91b119dfa015acba3 WatchSource:0}: Error finding container 4fe5cf9852d4117d643ccc9b78a202850726325147caa3d91b119dfa015acba3: Status 404 returned error can't find the container with id 4fe5cf9852d4117d643ccc9b78a202850726325147caa3d91b119dfa015acba3 Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.489944 4791 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1894e0203a8ccd1a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 00:09:42.489328922 +0000 UTC m=+239.968841449,LastTimestamp:2026-02-17 00:09:42.489328922 +0000 UTC m=+239.968841449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.534803 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="400ms" Feb 17 00:09:42 crc kubenswrapper[4791]: I0217 00:09:42.870761 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b"} Feb 17 00:09:42 crc kubenswrapper[4791]: I0217 00:09:42.870821 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"4fe5cf9852d4117d643ccc9b78a202850726325147caa3d91b119dfa015acba3"} Feb 17 00:09:42 crc kubenswrapper[4791]: I0217 00:09:42.871609 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.871752 4791 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 00:09:42 crc kubenswrapper[4791]: E0217 00:09:42.936546 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="800ms" Feb 17 00:09:43 crc kubenswrapper[4791]: I0217 00:09:43.224011 4791 
status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:43 crc kubenswrapper[4791]: E0217 00:09:43.738950 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="1.6s" Feb 17 00:09:44 crc kubenswrapper[4791]: E0217 00:09:44.052326 4791 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1894e0203a8ccd1a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 00:09:42.489328922 +0000 UTC m=+239.968841449,LastTimestamp:2026-02-17 00:09:42.489328922 +0000 UTC m=+239.968841449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 00:09:45 crc kubenswrapper[4791]: E0217 00:09:45.340902 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="3.2s" Feb 17 00:09:48 crc kubenswrapper[4791]: E0217 00:09:48.541755 4791 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="6.4s" Feb 17 00:09:50 crc kubenswrapper[4791]: I0217 00:09:50.920225 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 00:09:50 crc kubenswrapper[4791]: I0217 00:09:50.920568 4791 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86" exitCode=1 Feb 17 00:09:50 crc kubenswrapper[4791]: I0217 00:09:50.920615 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86"} Feb 17 00:09:50 crc kubenswrapper[4791]: I0217 00:09:50.921469 4791 scope.go:117] "RemoveContainer" containerID="d75a45a3d77783390bddaa1c2a5d6e6b4a23273a0c5d3b9b706d1ff1ed016f86" Feb 17 00:09:50 crc kubenswrapper[4791]: I0217 00:09:50.921845 4791 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:50 crc kubenswrapper[4791]: 
I0217 00:09:50.922547 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.219507 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.220701 4791 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.221439 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.239151 4791 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.239189 4791 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:51 crc kubenswrapper[4791]: E0217 00:09:51.239629 4791 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.240181 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:51 crc kubenswrapper[4791]: W0217 00:09:51.262490 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-2fcb52200f512e5ac2a434dba738911a325516376ae69a3b25c7b55a69b58026 WatchSource:0}: Error finding container 2fcb52200f512e5ac2a434dba738911a325516376ae69a3b25c7b55a69b58026: Status 404 returned error can't find the container with id 2fcb52200f512e5ac2a434dba738911a325516376ae69a3b25c7b55a69b58026 Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.931179 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.931312 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ae95a0fefe7b673983f1d92ddb64d23be5f53d379dce7ac00415b0758451432d"} Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.936788 4791 status_manager.go:851] "Failed to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.937143 4791 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.948769 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"bbb2e296ccc25b602f66a53d548d184bd8e5080c64c88ed3bd87081bfb7eadfe"} Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.948903 4791 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="bbb2e296ccc25b602f66a53d548d184bd8e5080c64c88ed3bd87081bfb7eadfe" exitCode=0 Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.948954 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2fcb52200f512e5ac2a434dba738911a325516376ae69a3b25c7b55a69b58026"} Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.950037 4791 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.950071 4791 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:51 crc kubenswrapper[4791]: E0217 00:09:51.950912 4791 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.954320 4791 status_manager.go:851] "Failed 
to get status for pod" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:51 crc kubenswrapper[4791]: I0217 00:09:51.954900 4791 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 17 00:09:52 crc kubenswrapper[4791]: I0217 00:09:52.958708 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1993b99b6471a61be1e29e24a843a3e24b3b94f6dce4054676a1305e3a708eba"} Feb 17 00:09:52 crc kubenswrapper[4791]: I0217 00:09:52.959662 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3e5acb7f7789ad110aa8db7ba145560490b34b9358b207d27d4629c2a950d708"} Feb 17 00:09:52 crc kubenswrapper[4791]: I0217 00:09:52.959740 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3565e93f71e21d3ebbec12d754879bbce24af23ef089b372174f14e53d7efa9f"} Feb 17 00:09:53 crc kubenswrapper[4791]: I0217 00:09:53.967828 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"08a2a45703fe50ae521d3213c3369f28e782672e5eb97008b0828e580448ac8d"} Feb 17 00:09:53 crc 
kubenswrapper[4791]: I0217 00:09:53.969192 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f125604f9d9ac1a341e6af7798be8282fbdae6e0ed3c3a7a897740559fad2602"} Feb 17 00:09:53 crc kubenswrapper[4791]: I0217 00:09:53.969288 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:53 crc kubenswrapper[4791]: I0217 00:09:53.968385 4791 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:53 crc kubenswrapper[4791]: I0217 00:09:53.969435 4791 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:55 crc kubenswrapper[4791]: I0217 00:09:55.401290 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:09:55 crc kubenswrapper[4791]: I0217 00:09:55.405708 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:09:55 crc kubenswrapper[4791]: I0217 00:09:55.983749 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:09:56 crc kubenswrapper[4791]: I0217 00:09:56.240338 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:56 crc kubenswrapper[4791]: I0217 00:09:56.240415 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:56 crc kubenswrapper[4791]: I0217 00:09:56.247788 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:58 crc kubenswrapper[4791]: I0217 00:09:58.977783 4791 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:59 crc kubenswrapper[4791]: I0217 00:09:59.001950 4791 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:59 crc kubenswrapper[4791]: I0217 00:09:59.001992 4791 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:09:59 crc kubenswrapper[4791]: I0217 00:09:59.005692 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:09:59 crc kubenswrapper[4791]: I0217 00:09:59.008803 4791 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="36ee7bce-4ced-431a-b5cd-18db071b5601" Feb 17 00:10:00 crc kubenswrapper[4791]: I0217 00:10:00.005944 4791 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:10:00 crc kubenswrapper[4791]: I0217 00:10:00.005983 4791 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9c4455fb-f818-4b8a-92dc-ac60431b5ff8" Feb 17 00:10:03 crc kubenswrapper[4791]: I0217 00:10:03.242663 4791 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="36ee7bce-4ced-431a-b5cd-18db071b5601" Feb 17 00:10:08 crc kubenswrapper[4791]: I0217 00:10:08.463834 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 00:10:08 crc kubenswrapper[4791]: I0217 00:10:08.945715 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.306245 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.364954 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.444582 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.604530 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.778678 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.787254 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.804741 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 00:10:09 crc kubenswrapper[4791]: I0217 00:10:09.978400 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.276905 4791 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.308086 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.326403 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.510862 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.601093 4791 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.651580 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.685881 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.743584 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 00:10:10 crc kubenswrapper[4791]: I0217 00:10:10.951953 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.290820 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.508434 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.564217 4791 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.574538 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.574600 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.612307 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.617932 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.657580 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.701892 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.770041 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.779802 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.817918 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.892831 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.911893 4791 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.918866 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.943799 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.963872 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.964230 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 00:10:11 crc kubenswrapper[4791]: I0217 00:10:11.988972 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.294886 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.356052 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.417997 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.502258 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.571250 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 17 00:10:12 crc 
kubenswrapper[4791]: I0217 00:10:12.754204 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.795338 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.808504 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.842245 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.858626 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.979853 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 00:10:12 crc kubenswrapper[4791]: I0217 00:10:12.992263 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.039473 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.230462 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.272214 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.279423 4791 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.401422 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.444261 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.554088 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.745604 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.760243 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.824954 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.838038 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.843931 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.860831 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.868530 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 00:10:13 crc 
kubenswrapper[4791]: I0217 00:10:13.892776 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 17 00:10:13 crc kubenswrapper[4791]: I0217 00:10:13.929380 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.111884 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.114745 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.182645 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.182656 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.315749 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.343814 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.379545 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.399066 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.459874 4791 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.538760 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.571797 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.592396 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.600771 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.716835 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.744230 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.795040 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.807220 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.821585 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.839786 4791 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.884077 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.890277 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.955523 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.971836 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 00:10:14 crc kubenswrapper[4791]: I0217 00:10:14.996727 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.013780 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.027795 4791 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.032613 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.088376 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.136664 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.254966 4791 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.277322 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.395672 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.492816 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.695284 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.705924 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.713514 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.785894 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.812245 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.877676 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 00:10:15 crc kubenswrapper[4791]: I0217 00:10:15.993824 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 00:10:16 crc 
kubenswrapper[4791]: I0217 00:10:16.106072 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.160954 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.292270 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.442389 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.710237 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.723782 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.770562 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.780471 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.825068 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.860175 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 00:10:16 crc kubenswrapper[4791]: I0217 00:10:16.882816 4791 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.012664 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.022999 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.069587 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.077440 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.100925 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.121385 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.201423 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.202659 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.218632 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.347825 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 
00:10:17.380557 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.443690 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.534925 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.570866 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.581961 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.633527 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.653756 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.711193 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.744390 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.748302 4791 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.755338 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 
00:10:17.755423 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.761052 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.777401 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.777345548 podStartE2EDuration="19.777345548s" podCreationTimestamp="2026-02-17 00:09:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:10:17.775370505 +0000 UTC m=+275.254883032" watchObservedRunningTime="2026-02-17 00:10:17.777345548 +0000 UTC m=+275.256858075" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.808623 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.828982 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.855629 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.905819 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.977853 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 00:10:17 crc kubenswrapper[4791]: I0217 00:10:17.985517 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 
00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.129518 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.152225 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.167573 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.171589 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.174363 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.175003 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.191572 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.192043 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.244627 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.263190 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 00:10:18 crc 
kubenswrapper[4791]: I0217 00:10:18.308244 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.362433 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.375632 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.428391 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.484279 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.610911 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.628776 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.644088 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.644365 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.652310 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.851062 4791 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"trusted-ca" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.870247 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.935681 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 00:10:18 crc kubenswrapper[4791]: I0217 00:10:18.979165 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.109039 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.123946 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.162927 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.172616 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.220711 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.360755 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.379742 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 
00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.399617 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.452418 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.454520 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.485721 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.489716 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.593290 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.609267 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.612633 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.632780 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.644859 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.650011 4791 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.655208 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.708684 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.717894 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.719493 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.768743 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 00:10:19 crc kubenswrapper[4791]: I0217 00:10:19.963464 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.115593 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.126819 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.285287 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.303731 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.479042 4791 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.592263 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.638231 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.677381 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.702770 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.726553 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.735628 4791 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.742490 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.821823 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.829026 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 00:10:20 crc kubenswrapper[4791]: I0217 00:10:20.996466 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 00:10:21 crc 
kubenswrapper[4791]: I0217 00:10:21.005464 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.160634 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.194217 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.253049 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.344256 4791 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.344520 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b" gracePeriod=5 Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.355624 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.455493 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.616840 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.646880 4791 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-oauth-apiserver"/"etcd-client" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.659162 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 00:10:21 crc kubenswrapper[4791]: I0217 00:10:21.827067 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.085606 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.107256 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.329925 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.345288 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.497846 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.575952 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.607562 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.693296 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.710502 4791 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.779415 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.827635 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.863248 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.882791 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 00:10:22 crc kubenswrapper[4791]: I0217 00:10:22.884258 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.053257 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.074141 4791 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.126366 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.134925 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.217020 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.273955 4791 
reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.554980 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.595351 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.659358 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.672570 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.828490 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.883497 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 00:10:23 crc kubenswrapper[4791]: I0217 00:10:23.970394 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 00:10:24 crc kubenswrapper[4791]: I0217 00:10:24.209487 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 00:10:24 crc kubenswrapper[4791]: I0217 00:10:24.476473 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 17 00:10:24 crc kubenswrapper[4791]: I0217 00:10:24.615371 4791 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 00:10:24 crc kubenswrapper[4791]: I0217 00:10:24.623440 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 00:10:24 crc kubenswrapper[4791]: I0217 00:10:24.661727 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 00:10:24 crc kubenswrapper[4791]: I0217 00:10:24.946648 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.019506 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cgmd4"] Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.019729 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cgmd4" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="registry-server" containerID="cri-o://5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01" gracePeriod=30 Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.035606 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8xbcp"] Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.035833 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8xbcp" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="registry-server" containerID="cri-o://0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5" gracePeriod=30 Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.047200 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bfffb"] Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.047519 4791 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerName="marketplace-operator" containerID="cri-o://1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f" gracePeriod=30 Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.059265 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h66xr"] Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.059571 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h66xr" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="registry-server" containerID="cri-o://5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a" gracePeriod=30 Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.063773 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s76xp"] Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.064062 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s76xp" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="registry-server" containerID="cri-o://33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1" gracePeriod=30 Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.093130 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8x7k"] Feb 17 00:10:25 crc kubenswrapper[4791]: E0217 00:10:25.093386 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" containerName="installer" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.093404 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" containerName="installer" Feb 17 00:10:25 crc 
kubenswrapper[4791]: E0217 00:10:25.093422 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.093432 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.093540 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d65fd1-6366-4ed0-bc40-10d5418435ea" containerName="installer" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.093558 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.093999 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.100621 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8x7k"] Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.291084 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/66e06ad0-6874-4a52-94d8-76da74f7336b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.291171 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66e06ad0-6874-4a52-94d8-76da74f7336b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") 
" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.291221 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kxfd\" (UniqueName: \"kubernetes.io/projected/66e06ad0-6874-4a52-94d8-76da74f7336b-kube-api-access-8kxfd\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.394027 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/66e06ad0-6874-4a52-94d8-76da74f7336b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.394194 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66e06ad0-6874-4a52-94d8-76da74f7336b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.394250 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kxfd\" (UniqueName: \"kubernetes.io/projected/66e06ad0-6874-4a52-94d8-76da74f7336b-kube-api-access-8kxfd\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.396277 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/66e06ad0-6874-4a52-94d8-76da74f7336b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.401715 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.403782 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/66e06ad0-6874-4a52-94d8-76da74f7336b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.416043 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kxfd\" (UniqueName: \"kubernetes.io/projected/66e06ad0-6874-4a52-94d8-76da74f7336b-kube-api-access-8kxfd\") pod \"marketplace-operator-79b997595-t8x7k\" (UID: \"66e06ad0-6874-4a52-94d8-76da74f7336b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.519234 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.539016 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.584510 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.806131 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8x7k"] Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.921783 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:10:25 crc kubenswrapper[4791]: I0217 00:10:25.962584 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.061041 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.065824 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.100062 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.105509 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l6xj\" (UniqueName: \"kubernetes.io/projected/13a5be44-f180-42a9-bff7-8ba69cc589f0-kube-api-access-2l6xj\") pod \"13a5be44-f180-42a9-bff7-8ba69cc589f0\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.105591 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-catalog-content\") pod \"6e6c03f6-847b-402c-bfde-6dd30870b907\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.105624 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-utilities\") pod \"6e6c03f6-847b-402c-bfde-6dd30870b907\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.105649 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-operator-metrics\") pod \"13a5be44-f180-42a9-bff7-8ba69cc589f0\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.105725 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-trusted-ca\") pod \"13a5be44-f180-42a9-bff7-8ba69cc589f0\" (UID: \"13a5be44-f180-42a9-bff7-8ba69cc589f0\") " Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.105776 4791 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6h6gj\" (UniqueName: \"kubernetes.io/projected/6e6c03f6-847b-402c-bfde-6dd30870b907-kube-api-access-6h6gj\") pod \"6e6c03f6-847b-402c-bfde-6dd30870b907\" (UID: \"6e6c03f6-847b-402c-bfde-6dd30870b907\") " Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.107379 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "13a5be44-f180-42a9-bff7-8ba69cc589f0" (UID: "13a5be44-f180-42a9-bff7-8ba69cc589f0"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.110665 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-utilities" (OuterVolumeSpecName: "utilities") pod "6e6c03f6-847b-402c-bfde-6dd30870b907" (UID: "6e6c03f6-847b-402c-bfde-6dd30870b907"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.112316 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e6c03f6-847b-402c-bfde-6dd30870b907-kube-api-access-6h6gj" (OuterVolumeSpecName: "kube-api-access-6h6gj") pod "6e6c03f6-847b-402c-bfde-6dd30870b907" (UID: "6e6c03f6-847b-402c-bfde-6dd30870b907"). InnerVolumeSpecName "kube-api-access-6h6gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.112787 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a5be44-f180-42a9-bff7-8ba69cc589f0-kube-api-access-2l6xj" (OuterVolumeSpecName: "kube-api-access-2l6xj") pod "13a5be44-f180-42a9-bff7-8ba69cc589f0" (UID: "13a5be44-f180-42a9-bff7-8ba69cc589f0"). 
InnerVolumeSpecName "kube-api-access-2l6xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.125782 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "13a5be44-f180-42a9-bff7-8ba69cc589f0" (UID: "13a5be44-f180-42a9-bff7-8ba69cc589f0"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.180119 4791 generic.go:334] "Generic (PLEG): container finished" podID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerID="33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1" exitCode=0 Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.180184 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s76xp" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.180188 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s76xp" event={"ID":"6e6c03f6-847b-402c-bfde-6dd30870b907","Type":"ContainerDied","Data":"33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1"} Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.180317 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s76xp" event={"ID":"6e6c03f6-847b-402c-bfde-6dd30870b907","Type":"ContainerDied","Data":"d67dd9afc803b2b8cf537c2d990677521dc5d35daf8d8201c4e1dd9fc3670d22"} Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.180363 4791 scope.go:117] "RemoveContainer" containerID="33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.182211 4791 generic.go:334] "Generic (PLEG): container finished" 
podID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerID="5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a" exitCode=0 Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.182283 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h66xr" event={"ID":"db1caaaf-7e8b-405c-97ff-7c507f068688","Type":"ContainerDied","Data":"5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a"} Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.182303 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h66xr" event={"ID":"db1caaaf-7e8b-405c-97ff-7c507f068688","Type":"ContainerDied","Data":"0a2777b10322faf11f31316ab253e0d88d658e157f8af01279ebdea911a277fa"} Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.182345 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h66xr" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.184730 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" event={"ID":"66e06ad0-6874-4a52-94d8-76da74f7336b","Type":"ContainerStarted","Data":"58eb500bf2f0a5e583661baccb8486b3a1f2366893b2ac87a31be78ba4c00230"} Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.184757 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" event={"ID":"66e06ad0-6874-4a52-94d8-76da74f7336b","Type":"ContainerStarted","Data":"d5e4395902c8c6c096ebff5df1fd9d99d8fbecbea5c749acb618b7dd081846bf"} Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.184983 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.186093 4791 generic.go:334] "Generic (PLEG): container finished" 
podID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerID="1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f" exitCode=0 Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.186238 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" event={"ID":"13a5be44-f180-42a9-bff7-8ba69cc589f0","Type":"ContainerDied","Data":"1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f"} Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.186256 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" event={"ID":"13a5be44-f180-42a9-bff7-8ba69cc589f0","Type":"ContainerDied","Data":"f1a3439d45cbb877a9cdb806affb8d5e0982a3ff436258b9fc60b97b89a3ef01"} Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.186296 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bfffb" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.187951 4791 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-t8x7k container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.58:8080/healthz\": dial tcp 10.217.0.58:8080: connect: connection refused" start-of-body= Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.187988 4791 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" podUID="66e06ad0-6874-4a52-94d8-76da74f7336b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.58:8080/healthz\": dial tcp 10.217.0.58:8080: connect: connection refused" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.189258 4791 generic.go:334] "Generic (PLEG): container finished" podID="48855520-658c-4579-a867-7e984bce56c7" 
containerID="5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01" exitCode=0 Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.189331 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgmd4" event={"ID":"48855520-658c-4579-a867-7e984bce56c7","Type":"ContainerDied","Data":"5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01"} Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.189360 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgmd4" event={"ID":"48855520-658c-4579-a867-7e984bce56c7","Type":"ContainerDied","Data":"f3a8bf4a4e4255984cba5a86035a408c84d7e84e14a3acd43f2d8aaf7ecd5cee"} Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.189426 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgmd4" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.191521 4791 generic.go:334] "Generic (PLEG): container finished" podID="f04d6e19-5c11-4527-8a49-3208098d2575" containerID="0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5" exitCode=0 Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.191557 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbcp" event={"ID":"f04d6e19-5c11-4527-8a49-3208098d2575","Type":"ContainerDied","Data":"0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5"} Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.191578 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbcp" event={"ID":"f04d6e19-5c11-4527-8a49-3208098d2575","Type":"ContainerDied","Data":"8181b3c79ba3d257e580e3f1df6f57468b32e1bd3945a81c8cb4156646ea066f"} Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.191647 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8xbcp" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.201863 4791 scope.go:117] "RemoveContainer" containerID="f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206431 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-utilities\") pod \"f04d6e19-5c11-4527-8a49-3208098d2575\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206485 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2txwc\" (UniqueName: \"kubernetes.io/projected/db1caaaf-7e8b-405c-97ff-7c507f068688-kube-api-access-2txwc\") pod \"db1caaaf-7e8b-405c-97ff-7c507f068688\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206514 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-catalog-content\") pod \"f04d6e19-5c11-4527-8a49-3208098d2575\" (UID: \"f04d6e19-5c11-4527-8a49-3208098d2575\") " Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206548 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-utilities\") pod \"48855520-658c-4579-a867-7e984bce56c7\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206580 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blntw\" (UniqueName: \"kubernetes.io/projected/f04d6e19-5c11-4527-8a49-3208098d2575-kube-api-access-blntw\") pod \"f04d6e19-5c11-4527-8a49-3208098d2575\" (UID: 
\"f04d6e19-5c11-4527-8a49-3208098d2575\") " Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206616 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-catalog-content\") pod \"db1caaaf-7e8b-405c-97ff-7c507f068688\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206659 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq9sd\" (UniqueName: \"kubernetes.io/projected/48855520-658c-4579-a867-7e984bce56c7-kube-api-access-rq9sd\") pod \"48855520-658c-4579-a867-7e984bce56c7\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206690 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-catalog-content\") pod \"48855520-658c-4579-a867-7e984bce56c7\" (UID: \"48855520-658c-4579-a867-7e984bce56c7\") " Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.206789 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-utilities\") pod \"db1caaaf-7e8b-405c-97ff-7c507f068688\" (UID: \"db1caaaf-7e8b-405c-97ff-7c507f068688\") " Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.207021 4791 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.207042 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h6gj\" (UniqueName: 
\"kubernetes.io/projected/6e6c03f6-847b-402c-bfde-6dd30870b907-kube-api-access-6h6gj\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.207054 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l6xj\" (UniqueName: \"kubernetes.io/projected/13a5be44-f180-42a9-bff7-8ba69cc589f0-kube-api-access-2l6xj\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.207068 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.207079 4791 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/13a5be44-f180-42a9-bff7-8ba69cc589f0-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.207842 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-utilities" (OuterVolumeSpecName: "utilities") pod "db1caaaf-7e8b-405c-97ff-7c507f068688" (UID: "db1caaaf-7e8b-405c-97ff-7c507f068688"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.207945 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-utilities" (OuterVolumeSpecName: "utilities") pod "f04d6e19-5c11-4527-8a49-3208098d2575" (UID: "f04d6e19-5c11-4527-8a49-3208098d2575"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.208250 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-utilities" (OuterVolumeSpecName: "utilities") pod "48855520-658c-4579-a867-7e984bce56c7" (UID: "48855520-658c-4579-a867-7e984bce56c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.212022 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04d6e19-5c11-4527-8a49-3208098d2575-kube-api-access-blntw" (OuterVolumeSpecName: "kube-api-access-blntw") pod "f04d6e19-5c11-4527-8a49-3208098d2575" (UID: "f04d6e19-5c11-4527-8a49-3208098d2575"). InnerVolumeSpecName "kube-api-access-blntw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.216817 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48855520-658c-4579-a867-7e984bce56c7-kube-api-access-rq9sd" (OuterVolumeSpecName: "kube-api-access-rq9sd") pod "48855520-658c-4579-a867-7e984bce56c7" (UID: "48855520-658c-4579-a867-7e984bce56c7"). InnerVolumeSpecName "kube-api-access-rq9sd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.238122 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k" podStartSLOduration=1.238084751 podStartE2EDuration="1.238084751s" podCreationTimestamp="2026-02-17 00:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:10:26.213231063 +0000 UTC m=+283.692743580" watchObservedRunningTime="2026-02-17 00:10:26.238084751 +0000 UTC m=+283.717597278" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.238206 4791 scope.go:117] "RemoveContainer" containerID="ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.238559 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db1caaaf-7e8b-405c-97ff-7c507f068688-kube-api-access-2txwc" (OuterVolumeSpecName: "kube-api-access-2txwc") pod "db1caaaf-7e8b-405c-97ff-7c507f068688" (UID: "db1caaaf-7e8b-405c-97ff-7c507f068688"). InnerVolumeSpecName "kube-api-access-2txwc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.244184 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bfffb"] Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.247862 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bfffb"] Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.259561 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db1caaaf-7e8b-405c-97ff-7c507f068688" (UID: "db1caaaf-7e8b-405c-97ff-7c507f068688"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.260847 4791 scope.go:117] "RemoveContainer" containerID="33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1" Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.261260 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1\": container with ID starting with 33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1 not found: ID does not exist" containerID="33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.261305 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1"} err="failed to get container status \"33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1\": rpc error: code = NotFound desc = could not find container \"33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1\": container with ID 
starting with 33c952b3bdd673e3f38be6e9330aa80bd1a4cfda1479b24aa731cd17ec616cc1 not found: ID does not exist" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.261332 4791 scope.go:117] "RemoveContainer" containerID="f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5" Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.261970 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5\": container with ID starting with f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5 not found: ID does not exist" containerID="f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.261997 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5"} err="failed to get container status \"f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5\": rpc error: code = NotFound desc = could not find container \"f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5\": container with ID starting with f8f32a7e13221c829def6748e59d8da3ed3ad1b8cb9d0ac30deda15bc24545d5 not found: ID does not exist" Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.262018 4791 scope.go:117] "RemoveContainer" containerID="ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8" Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.262639 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8\": container with ID starting with ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8 not found: ID does not exist" containerID="ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8" Feb 17 
00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.263163 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8"} err="failed to get container status \"ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8\": rpc error: code = NotFound desc = could not find container \"ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8\": container with ID starting with ce53e59e96ea0a38e6dcc5c750617796d07e5b87c6becde6e266b92875a433c8 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.263198 4791 scope.go:117] "RemoveContainer" containerID="5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.276230 4791 scope.go:117] "RemoveContainer" containerID="517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.295298 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e6c03f6-847b-402c-bfde-6dd30870b907" (UID: "6e6c03f6-847b-402c-bfde-6dd30870b907"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.300224 4791 scope.go:117] "RemoveContainer" containerID="498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.301947 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48855520-658c-4579-a867-7e984bce56c7" (UID: "48855520-658c-4579-a867-7e984bce56c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.308903 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.308935 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.308945 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e6c03f6-847b-402c-bfde-6dd30870b907-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.308955 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2txwc\" (UniqueName: \"kubernetes.io/projected/db1caaaf-7e8b-405c-97ff-7c507f068688-kube-api-access-2txwc\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.308980 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.308993 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blntw\" (UniqueName: \"kubernetes.io/projected/f04d6e19-5c11-4527-8a49-3208098d2575-kube-api-access-blntw\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.309002 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1caaaf-7e8b-405c-97ff-7c507f068688-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.309011 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq9sd\" (UniqueName: \"kubernetes.io/projected/48855520-658c-4579-a867-7e984bce56c7-kube-api-access-rq9sd\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.309020 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48855520-658c-4579-a867-7e984bce56c7-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.310526 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f04d6e19-5c11-4527-8a49-3208098d2575" (UID: "f04d6e19-5c11-4527-8a49-3208098d2575"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.315504 4791 scope.go:117] "RemoveContainer" containerID="5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.315985 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a\": container with ID starting with 5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a not found: ID does not exist" containerID="5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.316050 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a"} err="failed to get container status \"5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a\": rpc error: code = NotFound desc = could not find container \"5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a\": container with ID starting with 5d4c2a7bdc132569574b78294e0819b4803e57c50fba23d63fb08d7fe94edb8a not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.316092 4791 scope.go:117] "RemoveContainer" containerID="517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.318233 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943\": container with ID starting with 517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943 not found: ID does not exist" containerID="517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.318321 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943"} err="failed to get container status \"517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943\": rpc error: code = NotFound desc = could not find container \"517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943\": container with ID starting with 517005f22a3a08ad5b1e6c06b10a655495fa9186c8d9598bb0d8aad8f87d1943 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.318365 4791 scope.go:117] "RemoveContainer" containerID="498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.318450 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.318842 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd\": container with ID starting with 498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd not found: ID does not exist" containerID="498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.318875 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd"} err="failed to get container status \"498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd\": rpc error: code = NotFound desc = could not find container \"498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd\": container with ID starting with 498836b2c2c73b292dcebe1de7c28ad97044adda878173f23061bacd92d69dfd not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.318904 4791 scope.go:117] "RemoveContainer" containerID="1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.333857 4791 scope.go:117] "RemoveContainer" containerID="1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.334357 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f\": container with ID starting with 1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f not found: ID does not exist" containerID="1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.334396 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f"} err="failed to get container status \"1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f\": rpc error: code = NotFound desc = could not find container \"1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f\": container with ID starting with 1fbad1f45455de495c7be98f092d45f87a92cd56249ef29cc851a38fc824a00f not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.334429 4791 scope.go:117] "RemoveContainer" containerID="5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.353145 4791 scope.go:117] "RemoveContainer" containerID="0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.374949 4791 scope.go:117] "RemoveContainer" containerID="bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.389051 4791 scope.go:117] "RemoveContainer" containerID="5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.390159 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01\": container with ID starting with 5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01 not found: ID does not exist" containerID="5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.390202 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01"} err="failed to get container status \"5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01\": rpc error: code = NotFound desc = could not find container \"5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01\": container with ID starting with 5c2e0aff4866650debb5175265a6d01c2a0e5b55e721f15c505fa68b2701ca01 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.390232 4791 scope.go:117] "RemoveContainer" containerID="0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.391065 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61\": container with ID starting with 0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61 not found: ID does not exist" containerID="0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.391187 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61"} err="failed to get container status \"0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61\": rpc error: code = NotFound desc = could not find container \"0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61\": container with ID starting with 0adb68d60017352d76c2f5cbf51ce6821a2b97f8be3e0ed37b97bdded9fc7e61 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.391293 4791 scope.go:117] "RemoveContainer" containerID="bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.393766 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82\": container with ID starting with bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82 not found: ID does not exist" containerID="bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.393873 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82"} err="failed to get container status \"bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82\": rpc error: code = NotFound desc = could not find container \"bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82\": container with ID starting with bbbf553ba1a4a0c0e10d191ff91b9ffb504aa8c57407304faeb4477b07b27e82 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.393956 4791 scope.go:117] "RemoveContainer" containerID="0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.408427 4791 scope.go:117] "RemoveContainer" containerID="2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.409640 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04d6e19-5c11-4527-8a49-3208098d2575-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.428645 4791 scope.go:117] "RemoveContainer" containerID="18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.449146 4791 scope.go:117] "RemoveContainer" containerID="0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.449649 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5\": container with ID starting with 0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5 not found: ID does not exist" containerID="0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.449694 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5"} err="failed to get container status \"0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5\": rpc error: code = NotFound desc = could not find container \"0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5\": container with ID starting with 0221bbce827624c6e65868b10fe478e0cd26d718a850ba8cedf072c22010d1a5 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.449728 4791 scope.go:117] "RemoveContainer" containerID="2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.450138 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2\": container with ID starting with 2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2 not found: ID does not exist" containerID="2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.450170 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2"} err="failed to get container status \"2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2\": rpc error: code = NotFound desc = could not find container \"2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2\": container with ID starting with 2517c8370384ba83de2ad6cdce97d784d0d2c4299af3c5e90d0afd18716185e2 not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.450188 4791 scope.go:117] "RemoveContainer" containerID="18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c"
Feb 17 00:10:26 crc kubenswrapper[4791]: E0217 00:10:26.450588 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c\": container with ID starting with 18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c not found: ID does not exist" containerID="18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.450732 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c"} err="failed to get container status \"18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c\": rpc error: code = NotFound desc = could not find container \"18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c\": container with ID starting with 18e3187ecd2ce470123e0d8e986a332f912fb582355e7a34e0e65fa2416eee8c not found: ID does not exist"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.570124 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cgmd4"]
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.575360 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cgmd4"]
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.578265 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h66xr"]
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.581260 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h66xr"]
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.595580 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8xbcp"]
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.600397 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8xbcp"]
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.637414 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s76xp"]
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.660293 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s76xp"]
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.892461 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 17 00:10:26 crc kubenswrapper[4791]: I0217 00:10:26.893252 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.019608 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.019744 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.019885 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.019967 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.020041 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.020012 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.020198 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.020309 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.020428 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.021590 4791 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.021720 4791 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.021794 4791 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.021861 4791 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.026408 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.123692 4791 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.206994 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.207182 4791 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b" exitCode=137
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.207499 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.207610 4791 scope.go:117] "RemoveContainer" containerID="cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b"
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.214097 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-t8x7k"
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.234585 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" path="/var/lib/kubelet/pods/13a5be44-f180-42a9-bff7-8ba69cc589f0/volumes"
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.235359 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48855520-658c-4579-a867-7e984bce56c7" path="/var/lib/kubelet/pods/48855520-658c-4579-a867-7e984bce56c7/volumes"
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.236403 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" path="/var/lib/kubelet/pods/6e6c03f6-847b-402c-bfde-6dd30870b907/volumes"
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.238268 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" path="/var/lib/kubelet/pods/db1caaaf-7e8b-405c-97ff-7c507f068688/volumes"
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.239296 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" path="/var/lib/kubelet/pods/f04d6e19-5c11-4527-8a49-3208098d2575/volumes"
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.245552 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.251294 4791 scope.go:117] "RemoveContainer" containerID="cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b"
Feb 17 00:10:27 crc kubenswrapper[4791]: E0217 00:10:27.253419 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b\": container with ID starting with cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b not found: ID does not exist" containerID="cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b"
Feb 17 00:10:27 crc kubenswrapper[4791]: I0217 00:10:27.253484 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b"} err="failed to get container status \"cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b\": rpc error: code = NotFound desc = could not find container \"cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b\": container with ID starting with cd581d147e2a01123d1426c686ebf38ddd0bd82af7a233296466aaec95b07c9b not found: ID does not exist"
Feb 17 00:10:43 crc kubenswrapper[4791]: I0217 00:10:43.003273 4791 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645304 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-npbnh"]
Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645751 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="registry-server"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645762 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="registry-server"
Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645772 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="extract-utilities"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645777 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="extract-utilities"
Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645787 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="extract-content"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645794 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="extract-content"
Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645802 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="extract-utilities"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645807 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="extract-utilities"
Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645814 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="extract-utilities"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645820 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="extract-utilities"
Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645827 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="registry-server"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645833 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="registry-server"
Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645840 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="extract-content"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645846 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="extract-content"
Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645855 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="extract-utilities"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645861 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="extract-utilities"
Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645868 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="registry-server"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645873 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="registry-server"
Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645879 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="registry-server"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645885 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="registry-server"
Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645894 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="extract-content"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645900 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="extract-content"
Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645906 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="extract-content"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645912 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="extract-content"
Feb 17 00:10:47 crc kubenswrapper[4791]: E0217 00:10:47.645922 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerName="marketplace-operator"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.645934 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerName="marketplace-operator"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.646029 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="48855520-658c-4579-a867-7e984bce56c7" containerName="registry-server"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.646042 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1caaaf-7e8b-405c-97ff-7c507f068688" containerName="registry-server"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.646049 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a5be44-f180-42a9-bff7-8ba69cc589f0" containerName="marketplace-operator"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.646059 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e6c03f6-847b-402c-bfde-6dd30870b907" containerName="registry-server"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.646067 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04d6e19-5c11-4527-8a49-3208098d2575" containerName="registry-server"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.648169 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-npbnh"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.655388 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.667003 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-npbnh"]
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.792157 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f055fb-42f4-4699-8dd3-d93710f92ec8-utilities\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.792238 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f055fb-42f4-4699-8dd3-d93710f92ec8-catalog-content\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.792284 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5cnk\" (UniqueName: \"kubernetes.io/projected/c6f055fb-42f4-4699-8dd3-d93710f92ec8-kube-api-access-p5cnk\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.846700 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v6qjq"]
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.847621 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v6qjq"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.849785 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.853039 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v6qjq"]
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.892882 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5cnk\" (UniqueName: \"kubernetes.io/projected/c6f055fb-42f4-4699-8dd3-d93710f92ec8-kube-api-access-p5cnk\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.892956 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f055fb-42f4-4699-8dd3-d93710f92ec8-utilities\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.893143 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f055fb-42f4-4699-8dd3-d93710f92ec8-catalog-content\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.893815 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f055fb-42f4-4699-8dd3-d93710f92ec8-utilities\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.900701 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f055fb-42f4-4699-8dd3-d93710f92ec8-catalog-content\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.914278 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5cnk\" (UniqueName: \"kubernetes.io/projected/c6f055fb-42f4-4699-8dd3-d93710f92ec8-kube-api-access-p5cnk\") pod \"certified-operators-npbnh\" (UID: \"c6f055fb-42f4-4699-8dd3-d93710f92ec8\") " pod="openshift-marketplace/certified-operators-npbnh"
Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.936047 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jftdn"]
Feb 17 00:10:47 crc
kubenswrapper[4791]: I0217 00:10:47.936348 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" podUID="713c3460-f77d-4f7b-81bf-911f8f875dfe" containerName="controller-manager" containerID="cri-o://eccdaaec958face5d5dcfd6c5fddbc0b4b13a69d1c98503b7abb4bafa6bcd4bc" gracePeriod=30 Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.994637 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-729vq\" (UniqueName: \"kubernetes.io/projected/eebe5038-a970-42a4-81d4-fa84e6a64dd2-kube-api-access-729vq\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.994710 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-catalog-content\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:47 crc kubenswrapper[4791]: I0217 00:10:47.994750 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-utilities\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.005649 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.045786 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"] Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.046034 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" podUID="6a866a69-9159-4dd1-a03d-b2a0f703fb7b" containerName="route-controller-manager" containerID="cri-o://d077feb7a29e2e612d7324e3f4804db6415f0167d613f565b60c6d0f2bcce8f4" gracePeriod=30 Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.097934 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-catalog-content\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.098176 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-utilities\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.098225 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-729vq\" (UniqueName: \"kubernetes.io/projected/eebe5038-a970-42a4-81d4-fa84e6a64dd2-kube-api-access-729vq\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.099288 4791 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-utilities\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.099643 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-catalog-content\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.127938 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-729vq\" (UniqueName: \"kubernetes.io/projected/eebe5038-a970-42a4-81d4-fa84e6a64dd2-kube-api-access-729vq\") pod \"community-operators-v6qjq\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.179964 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.269866 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-npbnh"] Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.363669 4791 generic.go:334] "Generic (PLEG): container finished" podID="6a866a69-9159-4dd1-a03d-b2a0f703fb7b" containerID="d077feb7a29e2e612d7324e3f4804db6415f0167d613f565b60c6d0f2bcce8f4" exitCode=0 Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.363760 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" event={"ID":"6a866a69-9159-4dd1-a03d-b2a0f703fb7b","Type":"ContainerDied","Data":"d077feb7a29e2e612d7324e3f4804db6415f0167d613f565b60c6d0f2bcce8f4"} Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.365195 4791 generic.go:334] "Generic (PLEG): container finished" podID="713c3460-f77d-4f7b-81bf-911f8f875dfe" containerID="eccdaaec958face5d5dcfd6c5fddbc0b4b13a69d1c98503b7abb4bafa6bcd4bc" exitCode=0 Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.365263 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" event={"ID":"713c3460-f77d-4f7b-81bf-911f8f875dfe","Type":"ContainerDied","Data":"eccdaaec958face5d5dcfd6c5fddbc0b4b13a69d1c98503b7abb4bafa6bcd4bc"} Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.365288 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" event={"ID":"713c3460-f77d-4f7b-81bf-911f8f875dfe","Type":"ContainerDied","Data":"f303e766e2c893e279f8d6e69a4b7c3a7060f8cee57e4d09d9bd789c6c1a5750"} Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.365302 4791 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f303e766e2c893e279f8d6e69a4b7c3a7060f8cee57e4d09d9bd789c6c1a5750" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.366457 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npbnh" event={"ID":"c6f055fb-42f4-4699-8dd3-d93710f92ec8","Type":"ContainerStarted","Data":"376fa6875549d425cfd29925fcb1e20160a0be88530f720a69bdfeec941be614"} Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.443926 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.481726 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.495061 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v6qjq"] Feb 17 00:10:48 crc kubenswrapper[4791]: W0217 00:10:48.502182 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeebe5038_a970_42a4_81d4_fa84e6a64dd2.slice/crio-c8fc013d2e008f0f951114c42fdaed6119c2933ea57de36f1dcfd1ae325e546c WatchSource:0}: Error finding container c8fc013d2e008f0f951114c42fdaed6119c2933ea57de36f1dcfd1ae325e546c: Status 404 returned error can't find the container with id c8fc013d2e008f0f951114c42fdaed6119c2933ea57de36f1dcfd1ae325e546c Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602125 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-proxy-ca-bundles\") pod \"713c3460-f77d-4f7b-81bf-911f8f875dfe\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602168 4791 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/713c3460-f77d-4f7b-81bf-911f8f875dfe-serving-cert\") pod \"713c3460-f77d-4f7b-81bf-911f8f875dfe\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602187 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-serving-cert\") pod \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602234 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-client-ca\") pod \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602261 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-config\") pod \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602288 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdjvm\" (UniqueName: \"kubernetes.io/projected/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-kube-api-access-pdjvm\") pod \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\" (UID: \"6a866a69-9159-4dd1-a03d-b2a0f703fb7b\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602329 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-config\") pod \"713c3460-f77d-4f7b-81bf-911f8f875dfe\" (UID: 
\"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602359 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8zlc\" (UniqueName: \"kubernetes.io/projected/713c3460-f77d-4f7b-81bf-911f8f875dfe-kube-api-access-k8zlc\") pod \"713c3460-f77d-4f7b-81bf-911f8f875dfe\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.602375 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-client-ca\") pod \"713c3460-f77d-4f7b-81bf-911f8f875dfe\" (UID: \"713c3460-f77d-4f7b-81bf-911f8f875dfe\") " Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.603101 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "713c3460-f77d-4f7b-81bf-911f8f875dfe" (UID: "713c3460-f77d-4f7b-81bf-911f8f875dfe"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.603336 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-config" (OuterVolumeSpecName: "config") pod "713c3460-f77d-4f7b-81bf-911f8f875dfe" (UID: "713c3460-f77d-4f7b-81bf-911f8f875dfe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.603623 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-client-ca" (OuterVolumeSpecName: "client-ca") pod "6a866a69-9159-4dd1-a03d-b2a0f703fb7b" (UID: "6a866a69-9159-4dd1-a03d-b2a0f703fb7b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.603732 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-config" (OuterVolumeSpecName: "config") pod "6a866a69-9159-4dd1-a03d-b2a0f703fb7b" (UID: "6a866a69-9159-4dd1-a03d-b2a0f703fb7b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.603925 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-client-ca" (OuterVolumeSpecName: "client-ca") pod "713c3460-f77d-4f7b-81bf-911f8f875dfe" (UID: "713c3460-f77d-4f7b-81bf-911f8f875dfe"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.607866 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6a866a69-9159-4dd1-a03d-b2a0f703fb7b" (UID: "6a866a69-9159-4dd1-a03d-b2a0f703fb7b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.608486 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713c3460-f77d-4f7b-81bf-911f8f875dfe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "713c3460-f77d-4f7b-81bf-911f8f875dfe" (UID: "713c3460-f77d-4f7b-81bf-911f8f875dfe"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.608911 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-kube-api-access-pdjvm" (OuterVolumeSpecName: "kube-api-access-pdjvm") pod "6a866a69-9159-4dd1-a03d-b2a0f703fb7b" (UID: "6a866a69-9159-4dd1-a03d-b2a0f703fb7b"). InnerVolumeSpecName "kube-api-access-pdjvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.609600 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/713c3460-f77d-4f7b-81bf-911f8f875dfe-kube-api-access-k8zlc" (OuterVolumeSpecName: "kube-api-access-k8zlc") pod "713c3460-f77d-4f7b-81bf-911f8f875dfe" (UID: "713c3460-f77d-4f7b-81bf-911f8f875dfe"). InnerVolumeSpecName "kube-api-access-k8zlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704178 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704218 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8zlc\" (UniqueName: \"kubernetes.io/projected/713c3460-f77d-4f7b-81bf-911f8f875dfe-kube-api-access-k8zlc\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704234 4791 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704244 4791 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/713c3460-f77d-4f7b-81bf-911f8f875dfe-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704252 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/713c3460-f77d-4f7b-81bf-911f8f875dfe-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704261 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704269 4791 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704277 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:48 crc kubenswrapper[4791]: I0217 00:10:48.704288 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdjvm\" (UniqueName: \"kubernetes.io/projected/6a866a69-9159-4dd1-a03d-b2a0f703fb7b-kube-api-access-pdjvm\") on node \"crc\" DevicePath \"\"" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.373682 4791 generic.go:334] "Generic (PLEG): container finished" podID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerID="737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d" exitCode=0 Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.373772 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6qjq" 
event={"ID":"eebe5038-a970-42a4-81d4-fa84e6a64dd2","Type":"ContainerDied","Data":"737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d"} Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.374021 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6qjq" event={"ID":"eebe5038-a970-42a4-81d4-fa84e6a64dd2","Type":"ContainerStarted","Data":"c8fc013d2e008f0f951114c42fdaed6119c2933ea57de36f1dcfd1ae325e546c"} Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.376518 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" event={"ID":"6a866a69-9159-4dd1-a03d-b2a0f703fb7b","Type":"ContainerDied","Data":"0c85ff67a65174e4212f77cdeae113e56a44995c48f0f9d56c7ed9adda3bd480"} Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.376537 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.377142 4791 scope.go:117] "RemoveContainer" containerID="d077feb7a29e2e612d7324e3f4804db6415f0167d613f565b60c6d0f2bcce8f4" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.388019 4791 generic.go:334] "Generic (PLEG): container finished" podID="c6f055fb-42f4-4699-8dd3-d93710f92ec8" containerID="b8de34051f15482ce1813ccc5a2fddc7f5b160ddd741da04a7a7423df82e67ec" exitCode=0 Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.388138 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jftdn" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.388168 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npbnh" event={"ID":"c6f055fb-42f4-4699-8dd3-d93710f92ec8","Type":"ContainerDied","Data":"b8de34051f15482ce1813ccc5a2fddc7f5b160ddd741da04a7a7423df82e67ec"} Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.422549 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"] Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.424626 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ht455"] Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.430258 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jftdn"] Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.450848 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jftdn"] Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.669666 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps"] Feb 17 00:10:49 crc kubenswrapper[4791]: E0217 00:10:49.669996 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a866a69-9159-4dd1-a03d-b2a0f703fb7b" containerName="route-controller-manager" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.670014 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a866a69-9159-4dd1-a03d-b2a0f703fb7b" containerName="route-controller-manager" Feb 17 00:10:49 crc kubenswrapper[4791]: E0217 00:10:49.670031 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713c3460-f77d-4f7b-81bf-911f8f875dfe" 
containerName="controller-manager" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.670041 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="713c3460-f77d-4f7b-81bf-911f8f875dfe" containerName="controller-manager" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.670267 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a866a69-9159-4dd1-a03d-b2a0f703fb7b" containerName="route-controller-manager" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.670298 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="713c3460-f77d-4f7b-81bf-911f8f875dfe" containerName="controller-manager" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.670731 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.672338 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w"] Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.672440 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.672945 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.673478 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.673979 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.674519 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.690358 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.691295 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.691629 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.692035 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.692773 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.692780 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.693891 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 00:10:49 crc kubenswrapper[4791]: 
I0217 00:10:49.695811 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.706245 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w"] Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.719355 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps"] Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.732926 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.817953 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcvm8\" (UniqueName: \"kubernetes.io/projected/717da635-adc5-4037-920f-c0bdec5fe8c2-kube-api-access-xcvm8\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818247 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-config\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818376 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-config\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: 
\"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818471 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/717da635-adc5-4037-920f-c0bdec5fe8c2-serving-cert\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818555 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-serving-cert\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818692 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-proxy-ca-bundles\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818793 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-client-ca\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818897 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-client-ca\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.818946 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz76r\" (UniqueName: \"kubernetes.io/projected/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-kube-api-access-vz76r\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.919890 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-proxy-ca-bundles\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.919948 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-client-ca\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.919987 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-client-ca\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " 
pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.920022 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz76r\" (UniqueName: \"kubernetes.io/projected/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-kube-api-access-vz76r\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.920057 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcvm8\" (UniqueName: \"kubernetes.io/projected/717da635-adc5-4037-920f-c0bdec5fe8c2-kube-api-access-xcvm8\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.920090 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-config\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.920166 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-config\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.920189 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/717da635-adc5-4037-920f-c0bdec5fe8c2-serving-cert\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.920221 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-serving-cert\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.921014 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-client-ca\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.921181 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-client-ca\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.921597 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-config\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.922014 4791 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-config\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.923008 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-proxy-ca-bundles\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.925304 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/717da635-adc5-4037-920f-c0bdec5fe8c2-serving-cert\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.926177 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-serving-cert\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.938142 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz76r\" (UniqueName: \"kubernetes.io/projected/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-kube-api-access-vz76r\") pod \"controller-manager-67fbdd65b9-qjx4w\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " 
pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.944299 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcvm8\" (UniqueName: \"kubernetes.io/projected/717da635-adc5-4037-920f-c0bdec5fe8c2-kube-api-access-xcvm8\") pod \"route-controller-manager-76946b564d-qpgps\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:49 crc kubenswrapper[4791]: I0217 00:10:49.998936 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.012976 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.249878 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lbgxw"] Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.251282 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.253470 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.257776 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbgxw"] Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.324987 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz657\" (UniqueName: \"kubernetes.io/projected/ce989914-c2c6-4717-9acb-161dd734b4f6-kube-api-access-rz657\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.325058 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-utilities\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.325084 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-catalog-content\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.396335 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6qjq" 
event={"ID":"eebe5038-a970-42a4-81d4-fa84e6a64dd2","Type":"ContainerStarted","Data":"d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c"} Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.397926 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npbnh" event={"ID":"c6f055fb-42f4-4699-8dd3-d93710f92ec8","Type":"ContainerStarted","Data":"bc4c8ae0954bf18bfe8f02494a1b79ad9057ab43b8c261668a6653bea1617632"} Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.427199 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz657\" (UniqueName: \"kubernetes.io/projected/ce989914-c2c6-4717-9acb-161dd734b4f6-kube-api-access-rz657\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.427294 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-utilities\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.427320 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-catalog-content\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.427779 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-catalog-content\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " 
pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.427829 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-utilities\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.449644 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sv4n6"] Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.451086 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.452689 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.453683 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sv4n6"] Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.462074 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz657\" (UniqueName: \"kubernetes.io/projected/ce989914-c2c6-4717-9acb-161dd734b4f6-kube-api-access-rz657\") pod \"redhat-marketplace-lbgxw\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.503041 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w"] Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.529309 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx5m8\" (UniqueName: 
\"kubernetes.io/projected/f9f068a6-ed4e-4080-a05b-40562b5e8711-kube-api-access-xx5m8\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.529622 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f068a6-ed4e-4080-a05b-40562b5e8711-catalog-content\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.529714 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f068a6-ed4e-4080-a05b-40562b5e8711-utilities\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: W0217 00:10:50.531038 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cd0f241_6d57_4bfa_a4f4_1e8a14005896.slice/crio-3bf4ab2409c10cd4b103249a3e25a8339b444092b7e3b6edfc5d0f89e181ada0 WatchSource:0}: Error finding container 3bf4ab2409c10cd4b103249a3e25a8339b444092b7e3b6edfc5d0f89e181ada0: Status 404 returned error can't find the container with id 3bf4ab2409c10cd4b103249a3e25a8339b444092b7e3b6edfc5d0f89e181ada0 Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.556527 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps"] Feb 17 00:10:50 crc kubenswrapper[4791]: W0217 00:10:50.563465 4791 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod717da635_adc5_4037_920f_c0bdec5fe8c2.slice/crio-86de24ff9787ef37003f3a8da115935f7a0998ec6fbd5813a70c5d6002a2b45b WatchSource:0}: Error finding container 86de24ff9787ef37003f3a8da115935f7a0998ec6fbd5813a70c5d6002a2b45b: Status 404 returned error can't find the container with id 86de24ff9787ef37003f3a8da115935f7a0998ec6fbd5813a70c5d6002a2b45b Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.568430 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.630984 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f068a6-ed4e-4080-a05b-40562b5e8711-utilities\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.631090 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx5m8\" (UniqueName: \"kubernetes.io/projected/f9f068a6-ed4e-4080-a05b-40562b5e8711-kube-api-access-xx5m8\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.631151 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f068a6-ed4e-4080-a05b-40562b5e8711-catalog-content\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.631608 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f9f068a6-ed4e-4080-a05b-40562b5e8711-utilities\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.631778 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f068a6-ed4e-4080-a05b-40562b5e8711-catalog-content\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.662498 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx5m8\" (UniqueName: \"kubernetes.io/projected/f9f068a6-ed4e-4080-a05b-40562b5e8711-kube-api-access-xx5m8\") pod \"redhat-operators-sv4n6\" (UID: \"f9f068a6-ed4e-4080-a05b-40562b5e8711\") " pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.767546 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbgxw"] Feb 17 00:10:50 crc kubenswrapper[4791]: I0217 00:10:50.767556 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.227403 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a866a69-9159-4dd1-a03d-b2a0f703fb7b" path="/var/lib/kubelet/pods/6a866a69-9159-4dd1-a03d-b2a0f703fb7b/volumes" Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.228407 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="713c3460-f77d-4f7b-81bf-911f8f875dfe" path="/var/lib/kubelet/pods/713c3460-f77d-4f7b-81bf-911f8f875dfe/volumes" Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.231276 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sv4n6"] Feb 17 00:10:51 crc kubenswrapper[4791]: W0217 00:10:51.236238 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f068a6_ed4e_4080_a05b_40562b5e8711.slice/crio-46832706ae2418858a2606eb0f704bbf45564ffef987194aac1dc7b9aea8cf5c WatchSource:0}: Error finding container 46832706ae2418858a2606eb0f704bbf45564ffef987194aac1dc7b9aea8cf5c: Status 404 returned error can't find the container with id 46832706ae2418858a2606eb0f704bbf45564ffef987194aac1dc7b9aea8cf5c Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.403475 4791 generic.go:334] "Generic (PLEG): container finished" podID="c6f055fb-42f4-4699-8dd3-d93710f92ec8" containerID="bc4c8ae0954bf18bfe8f02494a1b79ad9057ab43b8c261668a6653bea1617632" exitCode=0 Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.403533 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npbnh" event={"ID":"c6f055fb-42f4-4699-8dd3-d93710f92ec8","Type":"ContainerDied","Data":"bc4c8ae0954bf18bfe8f02494a1b79ad9057ab43b8c261668a6653bea1617632"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.408058 4791 generic.go:334] "Generic (PLEG): container finished" 
podID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerID="3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195" exitCode=0 Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.408250 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbgxw" event={"ID":"ce989914-c2c6-4717-9acb-161dd734b4f6","Type":"ContainerDied","Data":"3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.408282 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbgxw" event={"ID":"ce989914-c2c6-4717-9acb-161dd734b4f6","Type":"ContainerStarted","Data":"0c0c0ef37f45961765809fc7c0c9b4244d7c69f4387e7e7fe8ce8a7787ea122a"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.416504 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" event={"ID":"6cd0f241-6d57-4bfa-a4f4-1e8a14005896","Type":"ContainerStarted","Data":"efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.416549 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" event={"ID":"6cd0f241-6d57-4bfa-a4f4-1e8a14005896","Type":"ContainerStarted","Data":"3bf4ab2409c10cd4b103249a3e25a8339b444092b7e3b6edfc5d0f89e181ada0"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.416910 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.422773 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" 
event={"ID":"717da635-adc5-4037-920f-c0bdec5fe8c2","Type":"ContainerStarted","Data":"a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.422813 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" event={"ID":"717da635-adc5-4037-920f-c0bdec5fe8c2","Type":"ContainerStarted","Data":"86de24ff9787ef37003f3a8da115935f7a0998ec6fbd5813a70c5d6002a2b45b"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.423541 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.432311 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.432352 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.432880 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv4n6" event={"ID":"f9f068a6-ed4e-4080-a05b-40562b5e8711","Type":"ContainerStarted","Data":"80cf4739f1986abc88ab217832093d35a7db6e5168ccfb1b73c108c3d63e7a8d"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.432929 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv4n6" event={"ID":"f9f068a6-ed4e-4080-a05b-40562b5e8711","Type":"ContainerStarted","Data":"46832706ae2418858a2606eb0f704bbf45564ffef987194aac1dc7b9aea8cf5c"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.440729 4791 generic.go:334] "Generic (PLEG): container finished" podID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" 
containerID="d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c" exitCode=0 Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.440772 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6qjq" event={"ID":"eebe5038-a970-42a4-81d4-fa84e6a64dd2","Type":"ContainerDied","Data":"d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c"} Feb 17 00:10:51 crc kubenswrapper[4791]: I0217 00:10:51.443527 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" podStartSLOduration=3.443505917 podStartE2EDuration="3.443505917s" podCreationTimestamp="2026-02-17 00:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:10:51.439898473 +0000 UTC m=+308.919411010" watchObservedRunningTime="2026-02-17 00:10:51.443505917 +0000 UTC m=+308.923018454" Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.452125 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6qjq" event={"ID":"eebe5038-a970-42a4-81d4-fa84e6a64dd2","Type":"ContainerStarted","Data":"ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1"} Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.456375 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npbnh" event={"ID":"c6f055fb-42f4-4699-8dd3-d93710f92ec8","Type":"ContainerStarted","Data":"97129438bd2b1068cd253e37059d602f7645f34520aa321c33c29138a41ff427"} Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.458153 4791 generic.go:334] "Generic (PLEG): container finished" podID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerID="e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2" exitCode=0 Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.458195 4791 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbgxw" event={"ID":"ce989914-c2c6-4717-9acb-161dd734b4f6","Type":"ContainerDied","Data":"e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2"} Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.459899 4791 generic.go:334] "Generic (PLEG): container finished" podID="f9f068a6-ed4e-4080-a05b-40562b5e8711" containerID="80cf4739f1986abc88ab217832093d35a7db6e5168ccfb1b73c108c3d63e7a8d" exitCode=0 Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.459936 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv4n6" event={"ID":"f9f068a6-ed4e-4080-a05b-40562b5e8711","Type":"ContainerDied","Data":"80cf4739f1986abc88ab217832093d35a7db6e5168ccfb1b73c108c3d63e7a8d"} Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.474432 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v6qjq" podStartSLOduration=2.7530371970000003 podStartE2EDuration="5.474409459s" podCreationTimestamp="2026-02-17 00:10:47 +0000 UTC" firstStartedPulling="2026-02-17 00:10:49.375901161 +0000 UTC m=+306.855413708" lastFinishedPulling="2026-02-17 00:10:52.097273443 +0000 UTC m=+309.576785970" observedRunningTime="2026-02-17 00:10:52.47413018 +0000 UTC m=+309.953642717" watchObservedRunningTime="2026-02-17 00:10:52.474409459 +0000 UTC m=+309.953921986" Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.475270 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" podStartSLOduration=4.475262866 podStartE2EDuration="4.475262866s" podCreationTimestamp="2026-02-17 00:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:10:51.540928176 +0000 UTC m=+309.020440703" 
watchObservedRunningTime="2026-02-17 00:10:52.475262866 +0000 UTC m=+309.954775393" Feb 17 00:10:52 crc kubenswrapper[4791]: I0217 00:10:52.511170 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-npbnh" podStartSLOduration=2.893352321 podStartE2EDuration="5.51115428s" podCreationTimestamp="2026-02-17 00:10:47 +0000 UTC" firstStartedPulling="2026-02-17 00:10:49.39025399 +0000 UTC m=+306.869766517" lastFinishedPulling="2026-02-17 00:10:52.008055949 +0000 UTC m=+309.487568476" observedRunningTime="2026-02-17 00:10:52.510161328 +0000 UTC m=+309.989673875" watchObservedRunningTime="2026-02-17 00:10:52.51115428 +0000 UTC m=+309.990666797" Feb 17 00:10:53 crc kubenswrapper[4791]: I0217 00:10:53.465949 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbgxw" event={"ID":"ce989914-c2c6-4717-9acb-161dd734b4f6","Type":"ContainerStarted","Data":"c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751"} Feb 17 00:10:53 crc kubenswrapper[4791]: I0217 00:10:53.467523 4791 generic.go:334] "Generic (PLEG): container finished" podID="f9f068a6-ed4e-4080-a05b-40562b5e8711" containerID="04145e421270586091691ca68902e3e565279b19ac1b4a2bc59f3b5979c68b29" exitCode=0 Feb 17 00:10:53 crc kubenswrapper[4791]: I0217 00:10:53.467649 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv4n6" event={"ID":"f9f068a6-ed4e-4080-a05b-40562b5e8711","Type":"ContainerDied","Data":"04145e421270586091691ca68902e3e565279b19ac1b4a2bc59f3b5979c68b29"} Feb 17 00:10:53 crc kubenswrapper[4791]: I0217 00:10:53.499402 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lbgxw" podStartSLOduration=2.088926142 podStartE2EDuration="3.499372346s" podCreationTimestamp="2026-02-17 00:10:50 +0000 UTC" firstStartedPulling="2026-02-17 00:10:51.410876735 +0000 UTC m=+308.890389262" 
lastFinishedPulling="2026-02-17 00:10:52.821322939 +0000 UTC m=+310.300835466" observedRunningTime="2026-02-17 00:10:53.495941228 +0000 UTC m=+310.975453765" watchObservedRunningTime="2026-02-17 00:10:53.499372346 +0000 UTC m=+310.978884883" Feb 17 00:10:54 crc kubenswrapper[4791]: I0217 00:10:54.473971 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv4n6" event={"ID":"f9f068a6-ed4e-4080-a05b-40562b5e8711","Type":"ContainerStarted","Data":"02221e817a7182ca546ec97737c6aac24abf44e4afca0417aa3e9df285e3c9e7"} Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.007300 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.007591 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.052922 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.074675 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sv4n6" podStartSLOduration=5.610781404 podStartE2EDuration="8.074660176s" podCreationTimestamp="2026-02-17 00:10:50 +0000 UTC" firstStartedPulling="2026-02-17 00:10:51.434305248 +0000 UTC m=+308.913817775" lastFinishedPulling="2026-02-17 00:10:53.89818402 +0000 UTC m=+311.377696547" observedRunningTime="2026-02-17 00:10:54.49455395 +0000 UTC m=+311.974066487" watchObservedRunningTime="2026-02-17 00:10:58.074660176 +0000 UTC m=+315.554172713" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.181147 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 
00:10:58.181228 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.221784 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.532958 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-npbnh" Feb 17 00:10:58 crc kubenswrapper[4791]: I0217 00:10:58.555131 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:11:00 crc kubenswrapper[4791]: I0217 00:11:00.569607 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:11:00 crc kubenswrapper[4791]: I0217 00:11:00.569651 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:11:00 crc kubenswrapper[4791]: I0217 00:11:00.633466 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:11:00 crc kubenswrapper[4791]: I0217 00:11:00.768147 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:11:00 crc kubenswrapper[4791]: I0217 00:11:00.768220 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:11:00 crc kubenswrapper[4791]: I0217 00:11:00.836497 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:11:01 crc kubenswrapper[4791]: I0217 00:11:01.578659 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-sv4n6" Feb 17 00:11:01 crc kubenswrapper[4791]: I0217 00:11:01.580833 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:11:07 crc kubenswrapper[4791]: I0217 00:11:07.939801 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w"] Feb 17 00:11:07 crc kubenswrapper[4791]: I0217 00:11:07.941805 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" podUID="6cd0f241-6d57-4bfa-a4f4-1e8a14005896" containerName="controller-manager" containerID="cri-o://efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea" gracePeriod=30 Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.446211 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.548336 4791 generic.go:334] "Generic (PLEG): container finished" podID="6cd0f241-6d57-4bfa-a4f4-1e8a14005896" containerID="efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea" exitCode=0 Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.548382 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" event={"ID":"6cd0f241-6d57-4bfa-a4f4-1e8a14005896","Type":"ContainerDied","Data":"efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea"} Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.548414 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" event={"ID":"6cd0f241-6d57-4bfa-a4f4-1e8a14005896","Type":"ContainerDied","Data":"3bf4ab2409c10cd4b103249a3e25a8339b444092b7e3b6edfc5d0f89e181ada0"} Feb 17 00:11:08 crc 
kubenswrapper[4791]: I0217 00:11:08.548430 4791 scope.go:117] "RemoveContainer" containerID="efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.548452 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.566618 4791 scope.go:117] "RemoveContainer" containerID="efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.567904 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-serving-cert\") pod \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.567955 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-proxy-ca-bundles\") pod \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " Feb 17 00:11:08 crc kubenswrapper[4791]: E0217 00:11:08.567953 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea\": container with ID starting with efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea not found: ID does not exist" containerID="efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.568000 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-client-ca\") pod 
\"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.568000 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea"} err="failed to get container status \"efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea\": rpc error: code = NotFound desc = could not find container \"efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea\": container with ID starting with efcf75944c8224e17969ab0c4a98ab3fa49b1d194215b0d37268e6cff63187ea not found: ID does not exist" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.568085 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-config\") pod \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.568164 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz76r\" (UniqueName: \"kubernetes.io/projected/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-kube-api-access-vz76r\") pod \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\" (UID: \"6cd0f241-6d57-4bfa-a4f4-1e8a14005896\") " Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.569355 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6cd0f241-6d57-4bfa-a4f4-1e8a14005896" (UID: "6cd0f241-6d57-4bfa-a4f4-1e8a14005896"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.569399 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-config" (OuterVolumeSpecName: "config") pod "6cd0f241-6d57-4bfa-a4f4-1e8a14005896" (UID: "6cd0f241-6d57-4bfa-a4f4-1e8a14005896"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.569431 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-client-ca" (OuterVolumeSpecName: "client-ca") pod "6cd0f241-6d57-4bfa-a4f4-1e8a14005896" (UID: "6cd0f241-6d57-4bfa-a4f4-1e8a14005896"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.577529 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-kube-api-access-vz76r" (OuterVolumeSpecName: "kube-api-access-vz76r") pod "6cd0f241-6d57-4bfa-a4f4-1e8a14005896" (UID: "6cd0f241-6d57-4bfa-a4f4-1e8a14005896"). InnerVolumeSpecName "kube-api-access-vz76r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.589489 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6cd0f241-6d57-4bfa-a4f4-1e8a14005896" (UID: "6cd0f241-6d57-4bfa-a4f4-1e8a14005896"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.669522 4791 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.669559 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.669573 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz76r\" (UniqueName: \"kubernetes.io/projected/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-kube-api-access-vz76r\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.669587 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.669600 4791 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cd0f241-6d57-4bfa-a4f4-1e8a14005896-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.879207 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w"] Feb 17 00:11:08 crc kubenswrapper[4791]: I0217 00:11:08.887508 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-67fbdd65b9-qjx4w"] Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.233622 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd0f241-6d57-4bfa-a4f4-1e8a14005896" 
path="/var/lib/kubelet/pods/6cd0f241-6d57-4bfa-a4f4-1e8a14005896/volumes" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.681880 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-568f8d896-cssls"] Feb 17 00:11:09 crc kubenswrapper[4791]: E0217 00:11:09.682195 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd0f241-6d57-4bfa-a4f4-1e8a14005896" containerName="controller-manager" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.682217 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd0f241-6d57-4bfa-a4f4-1e8a14005896" containerName="controller-manager" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.682342 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd0f241-6d57-4bfa-a4f4-1e8a14005896" containerName="controller-manager" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.682843 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.685758 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.685859 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.686226 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.686274 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.686392 4791 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.686408 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.694367 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-568f8d896-cssls"] Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.699275 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.783714 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzckm\" (UniqueName: \"kubernetes.io/projected/2f391307-5f7e-434c-b3a8-8a10278deaa7-kube-api-access-qzckm\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.783787 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-config\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.783813 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f391307-5f7e-434c-b3a8-8a10278deaa7-serving-cert\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 
00:11:09.783915 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-client-ca\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.784039 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-proxy-ca-bundles\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.885343 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-config\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.885423 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f391307-5f7e-434c-b3a8-8a10278deaa7-serving-cert\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.885466 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-client-ca\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " 
pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.885569 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-proxy-ca-bundles\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.885623 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzckm\" (UniqueName: \"kubernetes.io/projected/2f391307-5f7e-434c-b3a8-8a10278deaa7-kube-api-access-qzckm\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.887049 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-client-ca\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.888541 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-proxy-ca-bundles\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.888875 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f391307-5f7e-434c-b3a8-8a10278deaa7-config\") pod 
\"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.893521 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f391307-5f7e-434c-b3a8-8a10278deaa7-serving-cert\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:09 crc kubenswrapper[4791]: I0217 00:11:09.907073 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzckm\" (UniqueName: \"kubernetes.io/projected/2f391307-5f7e-434c-b3a8-8a10278deaa7-kube-api-access-qzckm\") pod \"controller-manager-568f8d896-cssls\" (UID: \"2f391307-5f7e-434c-b3a8-8a10278deaa7\") " pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:10 crc kubenswrapper[4791]: I0217 00:11:10.001806 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:10 crc kubenswrapper[4791]: I0217 00:11:10.452302 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-568f8d896-cssls"] Feb 17 00:11:10 crc kubenswrapper[4791]: W0217 00:11:10.461199 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f391307_5f7e_434c_b3a8_8a10278deaa7.slice/crio-bb37b7d862e58b4555e4de812efccc8f94c99621ae054412c565f5e3958552d8 WatchSource:0}: Error finding container bb37b7d862e58b4555e4de812efccc8f94c99621ae054412c565f5e3958552d8: Status 404 returned error can't find the container with id bb37b7d862e58b4555e4de812efccc8f94c99621ae054412c565f5e3958552d8 Feb 17 00:11:10 crc kubenswrapper[4791]: I0217 00:11:10.563485 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-568f8d896-cssls" event={"ID":"2f391307-5f7e-434c-b3a8-8a10278deaa7","Type":"ContainerStarted","Data":"bb37b7d862e58b4555e4de812efccc8f94c99621ae054412c565f5e3958552d8"} Feb 17 00:11:11 crc kubenswrapper[4791]: I0217 00:11:11.568663 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-568f8d896-cssls" event={"ID":"2f391307-5f7e-434c-b3a8-8a10278deaa7","Type":"ContainerStarted","Data":"95069b9be18ff43096a939103e2b7b0e6f8b87a7fef8214af4141bce49ce0930"} Feb 17 00:11:11 crc kubenswrapper[4791]: I0217 00:11:11.568962 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:11 crc kubenswrapper[4791]: I0217 00:11:11.574183 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-568f8d896-cssls" Feb 17 00:11:11 crc kubenswrapper[4791]: I0217 00:11:11.591911 4791 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-568f8d896-cssls" podStartSLOduration=4.591892723 podStartE2EDuration="4.591892723s" podCreationTimestamp="2026-02-17 00:11:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:11:11.590906034 +0000 UTC m=+329.070418571" watchObservedRunningTime="2026-02-17 00:11:11.591892723 +0000 UTC m=+329.071405250" Feb 17 00:11:27 crc kubenswrapper[4791]: I0217 00:11:27.926928 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps"] Feb 17 00:11:27 crc kubenswrapper[4791]: I0217 00:11:27.927694 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" podUID="717da635-adc5-4037-920f-c0bdec5fe8c2" containerName="route-controller-manager" containerID="cri-o://a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7" gracePeriod=30 Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.350528 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.431647 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcvm8\" (UniqueName: \"kubernetes.io/projected/717da635-adc5-4037-920f-c0bdec5fe8c2-kube-api-access-xcvm8\") pod \"717da635-adc5-4037-920f-c0bdec5fe8c2\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.431728 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-client-ca\") pod \"717da635-adc5-4037-920f-c0bdec5fe8c2\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.431770 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-config\") pod \"717da635-adc5-4037-920f-c0bdec5fe8c2\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.431797 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/717da635-adc5-4037-920f-c0bdec5fe8c2-serving-cert\") pod \"717da635-adc5-4037-920f-c0bdec5fe8c2\" (UID: \"717da635-adc5-4037-920f-c0bdec5fe8c2\") " Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.432439 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-client-ca" (OuterVolumeSpecName: "client-ca") pod "717da635-adc5-4037-920f-c0bdec5fe8c2" (UID: "717da635-adc5-4037-920f-c0bdec5fe8c2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.432627 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-config" (OuterVolumeSpecName: "config") pod "717da635-adc5-4037-920f-c0bdec5fe8c2" (UID: "717da635-adc5-4037-920f-c0bdec5fe8c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.437474 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/717da635-adc5-4037-920f-c0bdec5fe8c2-kube-api-access-xcvm8" (OuterVolumeSpecName: "kube-api-access-xcvm8") pod "717da635-adc5-4037-920f-c0bdec5fe8c2" (UID: "717da635-adc5-4037-920f-c0bdec5fe8c2"). InnerVolumeSpecName "kube-api-access-xcvm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.445223 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717da635-adc5-4037-920f-c0bdec5fe8c2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "717da635-adc5-4037-920f-c0bdec5fe8c2" (UID: "717da635-adc5-4037-920f-c0bdec5fe8c2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.532640 4791 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/717da635-adc5-4037-920f-c0bdec5fe8c2-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.532683 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcvm8\" (UniqueName: \"kubernetes.io/projected/717da635-adc5-4037-920f-c0bdec5fe8c2-kube-api-access-xcvm8\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.532696 4791 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.532706 4791 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717da635-adc5-4037-920f-c0bdec5fe8c2-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.672332 4791 generic.go:334] "Generic (PLEG): container finished" podID="717da635-adc5-4037-920f-c0bdec5fe8c2" containerID="a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7" exitCode=0 Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.672381 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.672394 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" event={"ID":"717da635-adc5-4037-920f-c0bdec5fe8c2","Type":"ContainerDied","Data":"a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7"} Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.672439 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps" event={"ID":"717da635-adc5-4037-920f-c0bdec5fe8c2","Type":"ContainerDied","Data":"86de24ff9787ef37003f3a8da115935f7a0998ec6fbd5813a70c5d6002a2b45b"} Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.672471 4791 scope.go:117] "RemoveContainer" containerID="a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.693439 4791 scope.go:117] "RemoveContainer" containerID="a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7" Feb 17 00:11:28 crc kubenswrapper[4791]: E0217 00:11:28.697030 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7\": container with ID starting with a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7 not found: ID does not exist" containerID="a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.697083 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7"} err="failed to get container status \"a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7\": rpc error: code = NotFound desc 
= could not find container \"a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7\": container with ID starting with a09b93aa566280e4c2ff23fa2944ebddfee0fcbfb6b7c1ab281b7b4af49fa5c7 not found: ID does not exist" Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.707603 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps"] Feb 17 00:11:28 crc kubenswrapper[4791]: I0217 00:11:28.710396 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-qpgps"] Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.230788 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="717da635-adc5-4037-920f-c0bdec5fe8c2" path="/var/lib/kubelet/pods/717da635-adc5-4037-920f-c0bdec5fe8c2/volumes" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.692768 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56"] Feb 17 00:11:29 crc kubenswrapper[4791]: E0217 00:11:29.693437 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717da635-adc5-4037-920f-c0bdec5fe8c2" containerName="route-controller-manager" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.693458 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="717da635-adc5-4037-920f-c0bdec5fe8c2" containerName="route-controller-manager" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.693678 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="717da635-adc5-4037-920f-c0bdec5fe8c2" containerName="route-controller-manager" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.694467 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.698297 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.699010 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.699903 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.700325 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.705688 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.706281 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.714470 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56"] Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.748393 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-serving-cert\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.748747 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-config\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.748918 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-client-ca\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.749033 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcq9b\" (UniqueName: \"kubernetes.io/projected/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-kube-api-access-qcq9b\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.850698 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-serving-cert\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.850809 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-config\") pod \"route-controller-manager-6c5f564774-2dv56\" 
(UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.850894 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-client-ca\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.850931 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcq9b\" (UniqueName: \"kubernetes.io/projected/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-kube-api-access-qcq9b\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.853139 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-client-ca\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.853810 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-config\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.856644 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-serving-cert\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:29 crc kubenswrapper[4791]: I0217 00:11:29.889348 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcq9b\" (UniqueName: \"kubernetes.io/projected/e8021891-e951-4ce4-bfc5-22c78ac8d0c2-kube-api-access-qcq9b\") pod \"route-controller-manager-6c5f564774-2dv56\" (UID: \"e8021891-e951-4ce4-bfc5-22c78ac8d0c2\") " pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:30 crc kubenswrapper[4791]: I0217 00:11:30.034896 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:30 crc kubenswrapper[4791]: I0217 00:11:30.479091 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56"] Feb 17 00:11:30 crc kubenswrapper[4791]: I0217 00:11:30.688336 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" event={"ID":"e8021891-e951-4ce4-bfc5-22c78ac8d0c2","Type":"ContainerStarted","Data":"d9a3e0d0ff27ebc95a49c73d761ad6d0b1a174dd5a21a00e31b713daf40a5d4a"} Feb 17 00:11:30 crc kubenswrapper[4791]: I0217 00:11:30.688695 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:30 crc kubenswrapper[4791]: I0217 00:11:30.688709 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" 
event={"ID":"e8021891-e951-4ce4-bfc5-22c78ac8d0c2","Type":"ContainerStarted","Data":"0762f5fbe709465d429332d01dd3c07ae7649c8132c8ffc6d91ebe8371f3c7ab"} Feb 17 00:11:30 crc kubenswrapper[4791]: I0217 00:11:30.709559 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" podStartSLOduration=3.7095335499999997 podStartE2EDuration="3.70953355s" podCreationTimestamp="2026-02-17 00:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:11:30.707207619 +0000 UTC m=+348.186720146" watchObservedRunningTime="2026-02-17 00:11:30.70953355 +0000 UTC m=+348.189046117" Feb 17 00:11:31 crc kubenswrapper[4791]: I0217 00:11:31.366754 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c5f564774-2dv56" Feb 17 00:11:54 crc kubenswrapper[4791]: I0217 00:11:54.973456 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:11:54 crc kubenswrapper[4791]: I0217 00:11:54.974193 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.219764 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vxszn"] Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.221183 4791 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.232530 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vxszn"] Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284124 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgsg7\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-kube-api-access-xgsg7\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284193 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/78c3872a-ce72-48e1-aee7-a3a20b86759c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284218 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78c3872a-ce72-48e1-aee7-a3a20b86759c-trusted-ca\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284329 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284368 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/78c3872a-ce72-48e1-aee7-a3a20b86759c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284389 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-registry-tls\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284431 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/78c3872a-ce72-48e1-aee7-a3a20b86759c-registry-certificates\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.284456 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-bound-sa-token\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.328253 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.386072 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/78c3872a-ce72-48e1-aee7-a3a20b86759c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.386136 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-registry-tls\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.386171 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/78c3872a-ce72-48e1-aee7-a3a20b86759c-registry-certificates\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.386209 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-bound-sa-token\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.386247 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgsg7\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-kube-api-access-xgsg7\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.386265 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/78c3872a-ce72-48e1-aee7-a3a20b86759c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.386294 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78c3872a-ce72-48e1-aee7-a3a20b86759c-trusted-ca\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.387674 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/78c3872a-ce72-48e1-aee7-a3a20b86759c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.387767 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78c3872a-ce72-48e1-aee7-a3a20b86759c-trusted-ca\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc 
kubenswrapper[4791]: I0217 00:12:00.387697 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/78c3872a-ce72-48e1-aee7-a3a20b86759c-registry-certificates\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.397462 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/78c3872a-ce72-48e1-aee7-a3a20b86759c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.397491 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-registry-tls\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.403915 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-bound-sa-token\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.418229 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgsg7\" (UniqueName: \"kubernetes.io/projected/78c3872a-ce72-48e1-aee7-a3a20b86759c-kube-api-access-xgsg7\") pod \"image-registry-66df7c8f76-vxszn\" (UID: \"78c3872a-ce72-48e1-aee7-a3a20b86759c\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.541657 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:00 crc kubenswrapper[4791]: I0217 00:12:00.993273 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vxszn"] Feb 17 00:12:01 crc kubenswrapper[4791]: W0217 00:12:01.003565 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78c3872a_ce72_48e1_aee7_a3a20b86759c.slice/crio-1adec818db4a1852947bbd28a587f809f461d74255ea5841a05a09e1a3570f95 WatchSource:0}: Error finding container 1adec818db4a1852947bbd28a587f809f461d74255ea5841a05a09e1a3570f95: Status 404 returned error can't find the container with id 1adec818db4a1852947bbd28a587f809f461d74255ea5841a05a09e1a3570f95 Feb 17 00:12:01 crc kubenswrapper[4791]: I0217 00:12:01.892673 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" event={"ID":"78c3872a-ce72-48e1-aee7-a3a20b86759c","Type":"ContainerStarted","Data":"df61d29ed86092cabced97f03c512c2cd2b013edaccab6e732d41f29b84f9639"} Feb 17 00:12:01 crc kubenswrapper[4791]: I0217 00:12:01.893010 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" event={"ID":"78c3872a-ce72-48e1-aee7-a3a20b86759c","Type":"ContainerStarted","Data":"1adec818db4a1852947bbd28a587f809f461d74255ea5841a05a09e1a3570f95"} Feb 17 00:12:01 crc kubenswrapper[4791]: I0217 00:12:01.893033 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:01 crc kubenswrapper[4791]: I0217 00:12:01.927155 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" podStartSLOduration=1.9270866500000001 podStartE2EDuration="1.92708665s" podCreationTimestamp="2026-02-17 00:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:12:01.918482939 +0000 UTC m=+379.397995496" watchObservedRunningTime="2026-02-17 00:12:01.92708665 +0000 UTC m=+379.406599217" Feb 17 00:12:20 crc kubenswrapper[4791]: I0217 00:12:20.549397 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-vxszn" Feb 17 00:12:20 crc kubenswrapper[4791]: I0217 00:12:20.617029 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wfpf2"] Feb 17 00:12:24 crc kubenswrapper[4791]: I0217 00:12:24.973532 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:12:24 crc kubenswrapper[4791]: I0217 00:12:24.974199 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:12:45 crc kubenswrapper[4791]: I0217 00:12:45.669337 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" podUID="c33165ce-519a-4b0e-b62a-f153d38fc14c" containerName="registry" containerID="cri-o://13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3" gracePeriod=30 Feb 17 00:12:46 crc kubenswrapper[4791]: 
I0217 00:12:46.063594 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.167276 4791 generic.go:334] "Generic (PLEG): container finished" podID="c33165ce-519a-4b0e-b62a-f153d38fc14c" containerID="13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3" exitCode=0 Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.167338 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" event={"ID":"c33165ce-519a-4b0e-b62a-f153d38fc14c","Type":"ContainerDied","Data":"13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3"} Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.167375 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" event={"ID":"c33165ce-519a-4b0e-b62a-f153d38fc14c","Type":"ContainerDied","Data":"a7ca3d122288b33fa8a847c41ca82278c8736040a34df9221628fbf85c038b55"} Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.167400 4791 scope.go:117] "RemoveContainer" containerID="13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.167560 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wfpf2" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182283 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c33165ce-519a-4b0e-b62a-f153d38fc14c-ca-trust-extracted\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182567 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182629 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-trusted-ca\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182678 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-bound-sa-token\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182743 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c33165ce-519a-4b0e-b62a-f153d38fc14c-installation-pull-secrets\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182801 4791 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-certificates\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182837 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-tls\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.182875 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptqw5\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-kube-api-access-ptqw5\") pod \"c33165ce-519a-4b0e-b62a-f153d38fc14c\" (UID: \"c33165ce-519a-4b0e-b62a-f153d38fc14c\") " Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.184493 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.186381 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.197634 4791 scope.go:117] "RemoveContainer" containerID="13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.197850 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.198436 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c33165ce-519a-4b0e-b62a-f153d38fc14c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: E0217 00:12:46.199683 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3\": container with ID starting with 13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3 not found: ID does not exist" containerID="13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.199731 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3"} err="failed to get container status \"13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3\": rpc error: code = NotFound desc = could not find container \"13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3\": container with ID starting with 13523832c1775f98b7ffe1d732a20c6ea35372708ee330562d595940dbcc07f3 not found: ID does not exist" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.200470 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.201451 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-kube-api-access-ptqw5" (OuterVolumeSpecName: "kube-api-access-ptqw5") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "kube-api-access-ptqw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.207966 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c33165ce-519a-4b0e-b62a-f153d38fc14c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.209222 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "c33165ce-519a-4b0e-b62a-f153d38fc14c" (UID: "c33165ce-519a-4b0e-b62a-f153d38fc14c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.283914 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptqw5\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-kube-api-access-ptqw5\") on node \"crc\" DevicePath \"\"" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.283956 4791 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c33165ce-519a-4b0e-b62a-f153d38fc14c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.283972 4791 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.284150 4791 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.284219 4791 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c33165ce-519a-4b0e-b62a-f153d38fc14c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.284240 4791 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.284255 4791 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c33165ce-519a-4b0e-b62a-f153d38fc14c-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.520765 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wfpf2"] Feb 17 00:12:46 crc kubenswrapper[4791]: I0217 00:12:46.529475 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wfpf2"] Feb 17 00:12:47 crc kubenswrapper[4791]: I0217 00:12:47.233422 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c33165ce-519a-4b0e-b62a-f153d38fc14c" path="/var/lib/kubelet/pods/c33165ce-519a-4b0e-b62a-f153d38fc14c/volumes" Feb 17 00:12:54 crc kubenswrapper[4791]: I0217 00:12:54.972676 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:12:54 crc kubenswrapper[4791]: I0217 00:12:54.973046 4791 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:12:54 crc kubenswrapper[4791]: I0217 00:12:54.973154 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:12:54 crc kubenswrapper[4791]: I0217 00:12:54.973942 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f0176ad0d56ad2a09e7895c1ba79efcf2efe001559c2c246d6cab82cbcdac2f"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 00:12:54 crc kubenswrapper[4791]: I0217 00:12:54.974034 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://9f0176ad0d56ad2a09e7895c1ba79efcf2efe001559c2c246d6cab82cbcdac2f" gracePeriod=600 Feb 17 00:12:55 crc kubenswrapper[4791]: I0217 00:12:55.237409 4791 generic.go:334] "Generic (PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="9f0176ad0d56ad2a09e7895c1ba79efcf2efe001559c2c246d6cab82cbcdac2f" exitCode=0 Feb 17 00:12:55 crc kubenswrapper[4791]: I0217 00:12:55.237621 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"9f0176ad0d56ad2a09e7895c1ba79efcf2efe001559c2c246d6cab82cbcdac2f"} Feb 17 00:12:55 crc kubenswrapper[4791]: I0217 00:12:55.238217 
4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"5711491f64a78b6ffc6b1c8acbd4b44d9c0260c393f0431c6d0856df0b4e56a4"} Feb 17 00:12:55 crc kubenswrapper[4791]: I0217 00:12:55.238252 4791 scope.go:117] "RemoveContainer" containerID="7560e8af1181f00f58063e6281a2e1611f98638fd398828387c074e50a051b28" Feb 17 00:14:43 crc kubenswrapper[4791]: I0217 00:14:43.454304 4791 scope.go:117] "RemoveContainer" containerID="eccdaaec958face5d5dcfd6c5fddbc0b4b13a69d1c98503b7abb4bafa6bcd4bc" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.168374 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"] Feb 17 00:15:00 crc kubenswrapper[4791]: E0217 00:15:00.169886 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c33165ce-519a-4b0e-b62a-f153d38fc14c" containerName="registry" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.169969 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33165ce-519a-4b0e-b62a-f153d38fc14c" containerName="registry" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.170143 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="c33165ce-519a-4b0e-b62a-f153d38fc14c" containerName="registry" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.170549 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.172384 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.172852 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.182000 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"] Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.291550 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtccn\" (UniqueName: \"kubernetes.io/projected/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-kube-api-access-jtccn\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.291934 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-secret-volume\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.292096 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-config-volume\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.392662 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-config-volume\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.392718 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtccn\" (UniqueName: \"kubernetes.io/projected/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-kube-api-access-jtccn\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.392759 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-secret-volume\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.393616 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-config-volume\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.409062 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-secret-volume\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.410087 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtccn\" (UniqueName: \"kubernetes.io/projected/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-kube-api-access-jtccn\") pod \"collect-profiles-29521455-psxw9\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.491975 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" Feb 17 00:15:00 crc kubenswrapper[4791]: I0217 00:15:00.689938 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9"] Feb 17 00:15:01 crc kubenswrapper[4791]: I0217 00:15:01.205428 4791 generic.go:334] "Generic (PLEG): container finished" podID="369d6cd5-3681-44c7-b799-b8e0e9bf2a65" containerID="99223c1a137749948cf8689e01f08f71f3aa17cb6c92040b021d417d8ae7e17e" exitCode=0 Feb 17 00:15:01 crc kubenswrapper[4791]: I0217 00:15:01.205619 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" event={"ID":"369d6cd5-3681-44c7-b799-b8e0e9bf2a65","Type":"ContainerDied","Data":"99223c1a137749948cf8689e01f08f71f3aa17cb6c92040b021d417d8ae7e17e"} Feb 17 00:15:01 crc kubenswrapper[4791]: I0217 00:15:01.207331 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" 
event={"ID":"369d6cd5-3681-44c7-b799-b8e0e9bf2a65","Type":"ContainerStarted","Data":"a142eef943ee69deda70b59ed181fe0b2f60b27aafc5b4fe4a8fe33752d4aff1"} Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.545100 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.618015 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-secret-volume\") pod \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.618251 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-config-volume\") pod \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.618337 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtccn\" (UniqueName: \"kubernetes.io/projected/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-kube-api-access-jtccn\") pod \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\" (UID: \"369d6cd5-3681-44c7-b799-b8e0e9bf2a65\") " Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.618996 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-config-volume" (OuterVolumeSpecName: "config-volume") pod "369d6cd5-3681-44c7-b799-b8e0e9bf2a65" (UID: "369d6cd5-3681-44c7-b799-b8e0e9bf2a65"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.623604 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "369d6cd5-3681-44c7-b799-b8e0e9bf2a65" (UID: "369d6cd5-3681-44c7-b799-b8e0e9bf2a65"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.623989 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-kube-api-access-jtccn" (OuterVolumeSpecName: "kube-api-access-jtccn") pod "369d6cd5-3681-44c7-b799-b8e0e9bf2a65" (UID: "369d6cd5-3681-44c7-b799-b8e0e9bf2a65"). InnerVolumeSpecName "kube-api-access-jtccn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.720621 4791 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.720672 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtccn\" (UniqueName: \"kubernetes.io/projected/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-kube-api-access-jtccn\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:02 crc kubenswrapper[4791]: I0217 00:15:02.720707 4791 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/369d6cd5-3681-44c7-b799-b8e0e9bf2a65-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:03 crc kubenswrapper[4791]: I0217 00:15:03.227027 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" Feb 17 00:15:03 crc kubenswrapper[4791]: I0217 00:15:03.231613 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521455-psxw9" event={"ID":"369d6cd5-3681-44c7-b799-b8e0e9bf2a65","Type":"ContainerDied","Data":"a142eef943ee69deda70b59ed181fe0b2f60b27aafc5b4fe4a8fe33752d4aff1"} Feb 17 00:15:03 crc kubenswrapper[4791]: I0217 00:15:03.231684 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a142eef943ee69deda70b59ed181fe0b2f60b27aafc5b4fe4a8fe33752d4aff1" Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.209303 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hldzt"] Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.210527 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-controller" containerID="cri-o://142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" gracePeriod=30 Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.210583 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="nbdb" containerID="cri-o://605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6" gracePeriod=30 Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.210657 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="sbdb" containerID="cri-o://ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3" gracePeriod=30 Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.210755 4791 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-acl-logging" containerID="cri-o://e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" gracePeriod=30 Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.210774 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" gracePeriod=30 Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.210790 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-node" containerID="cri-o://74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" gracePeriod=30 Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.210803 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="northd" containerID="cri-o://12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" gracePeriod=30 Feb 17 00:15:06 crc kubenswrapper[4791]: I0217 00:15:06.274896 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" containerID="cri-o://ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" gracePeriod=30 Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.036602 4791 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/3.log" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.043434 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovn-acl-logging/0.log" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.044735 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovn-controller/0.log" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.045665 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.140175 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6mh9d"] Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.140718 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.140858 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.140939 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="nbdb" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.141010 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="nbdb" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.141082 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="northd" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.141182 4791 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="northd" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.141263 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.141328 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.141403 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="sbdb" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.141471 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="sbdb" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.141537 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.141608 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.141681 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369d6cd5-3681-44c7-b799-b8e0e9bf2a65" containerName="collect-profiles" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.141747 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="369d6cd5-3681-44c7-b799-b8e0e9bf2a65" containerName="collect-profiles" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.141809 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-node" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.141885 4791 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-node" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.141954 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142016 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.142089 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142184 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.142254 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kubecfg-setup" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142329 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kubecfg-setup" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.142400 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-acl-logging" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142465 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-acl-logging" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142653 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="sbdb" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142734 4791 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-acl-logging" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142810 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142879 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="nbdb" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.142948 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143020 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="northd" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143088 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-node" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143209 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143274 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143333 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143401 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 
00:15:07.143467 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="369d6cd5-3681-44c7-b799-b8e0e9bf2a65" containerName="collect-profiles" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143534 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovn-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.143706 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.143771 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.144050 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.144144 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerName="ovnkube-controller" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.146499 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185226 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185285 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-bin\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185321 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-ovn-kubernetes\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185359 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-script-lib\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185402 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-etc-openvswitch\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185438 4791 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-var-lib-openvswitch\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185469 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-slash\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185499 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-netd\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185533 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-config\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185567 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r26vg\" (UniqueName: \"kubernetes.io/projected/e7fe508f-1e8c-4da7-8f99-108e73cb3791-kube-api-access-r26vg\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185599 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-log-socket\") pod 
\"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185629 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-node-log\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185657 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-kubelet\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185659 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185699 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-netns\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185734 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovn-node-metrics-cert\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185752 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185772 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-systemd-units\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185808 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). 
InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185844 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185844 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-openvswitch\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185892 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-ovn\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185937 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-env-overrides\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185967 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-systemd\") pod \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\" (UID: \"e7fe508f-1e8c-4da7-8f99-108e73cb3791\") " Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 
00:15:07.186296 4791 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.186325 4791 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.186346 4791 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.186362 4791 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185890 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185932 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.185967 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.186561 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-slash" (OuterVolumeSpecName: "host-slash") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.186608 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.187427 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.187476 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.187523 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.187594 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.187627 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.187636 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-log-socket" (OuterVolumeSpecName: "log-socket") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.188057 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.189619 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-node-log" (OuterVolumeSpecName: "node-log") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.194831 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7fe508f-1e8c-4da7-8f99-108e73cb3791-kube-api-access-r26vg" (OuterVolumeSpecName: "kube-api-access-r26vg") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "kube-api-access-r26vg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.194849 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.206427 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e7fe508f-1e8c-4da7-8f99-108e73cb3791" (UID: "e7fe508f-1e8c-4da7-8f99-108e73cb3791"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.278280 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovnkube-controller/3.log" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.281206 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovn-acl-logging/0.log" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.281743 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hldzt_e7fe508f-1e8c-4da7-8f99-108e73cb3791/ovn-controller/0.log" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282243 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" exitCode=0 Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282289 4791 generic.go:334] "Generic (PLEG): 
container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3" exitCode=0 Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282302 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6" exitCode=0 Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282311 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" exitCode=0 Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282319 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" exitCode=0 Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282327 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" exitCode=0 Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282335 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" exitCode=143 Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282344 4791 generic.go:334] "Generic (PLEG): container finished" podID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" containerID="142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" exitCode=143 Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282359 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" 
event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282398 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282409 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282418 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282396 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282429 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282501 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282523 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282536 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282570 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282578 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282585 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"} Feb 17 00:15:07 crc kubenswrapper[4791]: 
I0217 00:15:07.282592 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282599 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282604 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282611 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282622 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282662 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282672 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282680 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282687 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282733 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282746 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282753 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282761 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282768 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282774 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282785 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282819 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282833 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282841 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282848 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282855 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282861 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282868 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 
00:15:07.282874 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282881 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282887 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282898 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hldzt" event={"ID":"e7fe508f-1e8c-4da7-8f99-108e73cb3791","Type":"ContainerDied","Data":"8359c5871ee1aee2d63af5dec0cce97a0b6622d7bd312c2093b490d8e6067659"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282909 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282918 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282925 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282954 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282963 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282970 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282977 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282984 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282991 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282999 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.282757 4791 scope.go:117] "RemoveContainer" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.286514 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/2.log" Feb 17 
00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.288470 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/1.log" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.288549 4791 generic.go:334] "Generic (PLEG): container finished" podID="1104c109-74aa-4fc4-8a1b-914a0d5803a4" containerID="583de3e38aa471160a347992af3add013ed63c250a69e768895a5ca5ef559bea" exitCode=2 Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.288580 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerDied","Data":"583de3e38aa471160a347992af3add013ed63c250a69e768895a5ca5ef559bea"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.288604 4791 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c"} Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.288946 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-cni-netd\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.288977 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovn-node-metrics-cert\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289029 4791 scope.go:117] "RemoveContainer" 
containerID="583de3e38aa471160a347992af3add013ed63c250a69e768895a5ca5ef559bea" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289029 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-systemd-units\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289194 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-run-netns\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289347 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.289352 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-299s7_openshift-multus(1104c109-74aa-4fc4-8a1b-914a0d5803a4)\"" pod="openshift-multus/multus-299s7" podUID="1104c109-74aa-4fc4-8a1b-914a0d5803a4" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289386 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-log-socket\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289485 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-node-log\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289529 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289575 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-etc-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289613 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-kubelet\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289634 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovnkube-config\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289655 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-run-ovn-kubernetes\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289677 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovnkube-script-lib\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289788 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-systemd\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289835 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-slash\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289921 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-ovn\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.289961 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-env-overrides\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290021 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnddl\" (UniqueName: \"kubernetes.io/projected/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-kube-api-access-wnddl\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290054 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-cni-bin\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290075 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-var-lib-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290197 4791 reconciler_common.go:293] "Volume detached for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-slash\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290220 4791 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290232 4791 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290243 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r26vg\" (UniqueName: \"kubernetes.io/projected/e7fe508f-1e8c-4da7-8f99-108e73cb3791-kube-api-access-r26vg\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290281 4791 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-log-socket\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290293 4791 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-node-log\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290303 4791 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290314 4791 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 17 
00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290325 4791 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290359 4791 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290372 4791 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290383 4791 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290393 4791 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290404 4791 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290439 4791 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fe508f-1e8c-4da7-8f99-108e73cb3791-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.290452 4791 reconciler_common.go:293] "Volume detached for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7fe508f-1e8c-4da7-8f99-108e73cb3791-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.310724 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hldzt"] Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.315021 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hldzt"] Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.317863 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.360832 4791 scope.go:117] "RemoveContainer" containerID="ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.380996 4791 scope.go:117] "RemoveContainer" containerID="605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391688 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-ovn\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391718 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-env-overrides\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391738 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnddl\" (UniqueName: 
\"kubernetes.io/projected/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-kube-api-access-wnddl\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391752 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-cni-bin\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391768 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-var-lib-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391793 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-cni-netd\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391809 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovn-node-metrics-cert\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391824 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-systemd-units\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391838 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-run-netns\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391855 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391871 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-log-socket\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391892 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-node-log\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391912 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391932 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-etc-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391949 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-kubelet\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391964 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovnkube-config\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.391982 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-run-ovn-kubernetes\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392003 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovnkube-script-lib\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392034 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-systemd\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392070 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-slash\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392169 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-slash\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392376 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392466 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-cni-netd\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392535 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-var-lib-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392528 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-node-log\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392503 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392959 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-ovn\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392644 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-run-systemd\") pod \"ovnkube-node-6mh9d\" (UID: 
\"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392678 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-kubelet\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392666 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-etc-openvswitch\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392713 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-systemd-units\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392718 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-run-netns\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.392735 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-cni-bin\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc 
kubenswrapper[4791]: I0217 00:15:07.392627 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-host-run-ovn-kubernetes\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.393085 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovnkube-config\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.393155 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-log-socket\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.393505 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-env-overrides\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.394731 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovnkube-script-lib\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.399715 4791 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-ovn-node-metrics-cert\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.406337 4791 scope.go:117] "RemoveContainer" containerID="12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.408594 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnddl\" (UniqueName: \"kubernetes.io/projected/b4f2b02f-cfc0-42a9-832d-adb0268cc26d-kube-api-access-wnddl\") pod \"ovnkube-node-6mh9d\" (UID: \"b4f2b02f-cfc0-42a9-832d-adb0268cc26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.447407 4791 scope.go:117] "RemoveContainer" containerID="0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.466358 4791 scope.go:117] "RemoveContainer" containerID="74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.468343 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.484209 4791 scope.go:117] "RemoveContainer" containerID="e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.499976 4791 scope.go:117] "RemoveContainer" containerID="142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.519295 4791 scope.go:117] "RemoveContainer" containerID="880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.535026 4791 scope.go:117] "RemoveContainer" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.535513 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": container with ID starting with ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf not found: ID does not exist" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.535550 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} err="failed to get container status \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": rpc error: code = NotFound desc = could not find container \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": container with ID starting with ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.535579 4791 scope.go:117] "RemoveContainer" 
containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.535849 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": container with ID starting with b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644 not found: ID does not exist" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.535876 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} err="failed to get container status \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": rpc error: code = NotFound desc = could not find container \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": container with ID starting with b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.535894 4791 scope.go:117] "RemoveContainer" containerID="ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.536457 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": container with ID starting with ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3 not found: ID does not exist" containerID="ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.536502 4791 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"} err="failed to get container status \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": rpc error: code = NotFound desc = could not find container \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": container with ID starting with ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.536530 4791 scope.go:117] "RemoveContainer" containerID="605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.536844 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": container with ID starting with 605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6 not found: ID does not exist" containerID="605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.536888 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"} err="failed to get container status \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": rpc error: code = NotFound desc = could not find container \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": container with ID starting with 605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.536915 4791 scope.go:117] "RemoveContainer" containerID="12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.537404 4791 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": container with ID starting with 12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1 not found: ID does not exist" containerID="12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.537429 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"} err="failed to get container status \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": rpc error: code = NotFound desc = could not find container \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": container with ID starting with 12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.537445 4791 scope.go:117] "RemoveContainer" containerID="0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.537699 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": container with ID starting with 0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55 not found: ID does not exist" containerID="0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.537739 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"} err="failed to get container status \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": rpc error: code = NotFound desc = could not find container 
\"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": container with ID starting with 0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.537764 4791 scope.go:117] "RemoveContainer" containerID="74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.538046 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": container with ID starting with 74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87 not found: ID does not exist" containerID="74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.538071 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"} err="failed to get container status \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": rpc error: code = NotFound desc = could not find container \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": container with ID starting with 74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.538130 4791 scope.go:117] "RemoveContainer" containerID="e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.538378 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": container with ID starting with e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e not found: ID does not exist" 
containerID="e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.538441 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"} err="failed to get container status \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": rpc error: code = NotFound desc = could not find container \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": container with ID starting with e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.538467 4791 scope.go:117] "RemoveContainer" containerID="142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.538817 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": container with ID starting with 142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932 not found: ID does not exist" containerID="142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.538853 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"} err="failed to get container status \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": rpc error: code = NotFound desc = could not find container \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": container with ID starting with 142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.538880 4791 scope.go:117] 
"RemoveContainer" containerID="880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6" Feb 17 00:15:07 crc kubenswrapper[4791]: E0217 00:15:07.539370 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": container with ID starting with 880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6 not found: ID does not exist" containerID="880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.539402 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"} err="failed to get container status \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": rpc error: code = NotFound desc = could not find container \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": container with ID starting with 880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.539436 4791 scope.go:117] "RemoveContainer" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.539694 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} err="failed to get container status \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": rpc error: code = NotFound desc = could not find container \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": container with ID starting with ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.539721 4791 
scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.540085 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} err="failed to get container status \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": rpc error: code = NotFound desc = could not find container \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": container with ID starting with b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.540152 4791 scope.go:117] "RemoveContainer" containerID="ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.540485 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"} err="failed to get container status \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": rpc error: code = NotFound desc = could not find container \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": container with ID starting with ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.540545 4791 scope.go:117] "RemoveContainer" containerID="605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.541205 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"} err="failed to get container status \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": rpc 
error: code = NotFound desc = could not find container \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": container with ID starting with 605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.541240 4791 scope.go:117] "RemoveContainer" containerID="12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.541477 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"} err="failed to get container status \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": rpc error: code = NotFound desc = could not find container \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": container with ID starting with 12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.541501 4791 scope.go:117] "RemoveContainer" containerID="0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.543624 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"} err="failed to get container status \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": rpc error: code = NotFound desc = could not find container \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": container with ID starting with 0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.543664 4791 scope.go:117] "RemoveContainer" containerID="74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" Feb 17 00:15:07 crc 
kubenswrapper[4791]: I0217 00:15:07.543994 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"} err="failed to get container status \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": rpc error: code = NotFound desc = could not find container \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": container with ID starting with 74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.544033 4791 scope.go:117] "RemoveContainer" containerID="e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.544429 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"} err="failed to get container status \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": rpc error: code = NotFound desc = could not find container \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": container with ID starting with e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.544460 4791 scope.go:117] "RemoveContainer" containerID="142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.544791 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"} err="failed to get container status \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": rpc error: code = NotFound desc = could not find container \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": container 
with ID starting with 142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.544817 4791 scope.go:117] "RemoveContainer" containerID="880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.545143 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"} err="failed to get container status \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": rpc error: code = NotFound desc = could not find container \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": container with ID starting with 880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.545174 4791 scope.go:117] "RemoveContainer" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.545492 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} err="failed to get container status \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": rpc error: code = NotFound desc = could not find container \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": container with ID starting with ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.545518 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.545819 4791 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} err="failed to get container status \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": rpc error: code = NotFound desc = could not find container \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": container with ID starting with b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.545851 4791 scope.go:117] "RemoveContainer" containerID="ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.546285 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"} err="failed to get container status \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": rpc error: code = NotFound desc = could not find container \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": container with ID starting with ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.546309 4791 scope.go:117] "RemoveContainer" containerID="605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.546606 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"} err="failed to get container status \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": rpc error: code = NotFound desc = could not find container \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": container with ID starting with 605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6 not found: ID does not 
exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.546637 4791 scope.go:117] "RemoveContainer" containerID="12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.546896 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"} err="failed to get container status \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": rpc error: code = NotFound desc = could not find container \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": container with ID starting with 12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.546924 4791 scope.go:117] "RemoveContainer" containerID="0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.547312 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"} err="failed to get container status \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": rpc error: code = NotFound desc = could not find container \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": container with ID starting with 0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.547349 4791 scope.go:117] "RemoveContainer" containerID="74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.547863 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"} err="failed to get container status 
\"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": rpc error: code = NotFound desc = could not find container \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": container with ID starting with 74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.547891 4791 scope.go:117] "RemoveContainer" containerID="e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.548183 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"} err="failed to get container status \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": rpc error: code = NotFound desc = could not find container \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": container with ID starting with e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.548213 4791 scope.go:117] "RemoveContainer" containerID="142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.548575 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"} err="failed to get container status \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": rpc error: code = NotFound desc = could not find container \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": container with ID starting with 142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.548603 4791 scope.go:117] "RemoveContainer" 
containerID="880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.548857 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"} err="failed to get container status \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": rpc error: code = NotFound desc = could not find container \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": container with ID starting with 880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.548886 4791 scope.go:117] "RemoveContainer" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.549191 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} err="failed to get container status \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": rpc error: code = NotFound desc = could not find container \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": container with ID starting with ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.549221 4791 scope.go:117] "RemoveContainer" containerID="b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.549507 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644"} err="failed to get container status \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": rpc error: code = NotFound desc = could 
not find container \"b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644\": container with ID starting with b5e7e7512b2366ca9c13ac459145f34fde3958ac28a26d7ee64ed4f1d747a644 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.549534 4791 scope.go:117] "RemoveContainer" containerID="ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.549780 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3"} err="failed to get container status \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": rpc error: code = NotFound desc = could not find container \"ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3\": container with ID starting with ca784df3a5eec3bca4599f48be6daba940d0369391f2ead6ce9567f05b70f4c3 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.549807 4791 scope.go:117] "RemoveContainer" containerID="605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.550176 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6"} err="failed to get container status \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": rpc error: code = NotFound desc = could not find container \"605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6\": container with ID starting with 605b9b4373ebf47a79a0a0883bbd14bc0bfea006d649e33b7042bbaa478298c6 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.550200 4791 scope.go:117] "RemoveContainer" containerID="12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 
00:15:07.550563 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1"} err="failed to get container status \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": rpc error: code = NotFound desc = could not find container \"12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1\": container with ID starting with 12a80e92ac0c940f3191afaeaa592bf2f2b30a488503f52022a6fdd0449de2e1 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.550609 4791 scope.go:117] "RemoveContainer" containerID="0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.550924 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55"} err="failed to get container status \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": rpc error: code = NotFound desc = could not find container \"0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55\": container with ID starting with 0ea5f34fcd4f5f121a5c68c854fc5bcdd4a7f1ef32f34f6e5e2fd3eb2c242a55 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.550955 4791 scope.go:117] "RemoveContainer" containerID="74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.551356 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87"} err="failed to get container status \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": rpc error: code = NotFound desc = could not find container \"74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87\": container with ID starting with 
74c4f4302d87ebc67603aaca9ba96c3dbb49c706bff9376e82eb0103af77ec87 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.551389 4791 scope.go:117] "RemoveContainer" containerID="e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.551711 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e"} err="failed to get container status \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": rpc error: code = NotFound desc = could not find container \"e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e\": container with ID starting with e9a2d8f96c949bddbd9a446c5f03baa93b3d321eb9afbae40096f308a7bf476e not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.551739 4791 scope.go:117] "RemoveContainer" containerID="142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.552078 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932"} err="failed to get container status \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": rpc error: code = NotFound desc = could not find container \"142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932\": container with ID starting with 142b6b49ace78109befbcfcd66fea185e387536a60511d06229f495388f3f932 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.552127 4791 scope.go:117] "RemoveContainer" containerID="880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.552427 4791 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6"} err="failed to get container status \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": rpc error: code = NotFound desc = could not find container \"880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6\": container with ID starting with 880c5ffe17fb04d31b5ca40c40216dddd776502254420eebfb4c7ed6b113c7f6 not found: ID does not exist" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.552455 4791 scope.go:117] "RemoveContainer" containerID="ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf" Feb 17 00:15:07 crc kubenswrapper[4791]: I0217 00:15:07.552691 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf"} err="failed to get container status \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": rpc error: code = NotFound desc = could not find container \"ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf\": container with ID starting with ace9e7bbc42ad11865e3cabe40999a07ee61cf9aab5e2b4427219c33fa061ddf not found: ID does not exist" Feb 17 00:15:08 crc kubenswrapper[4791]: I0217 00:15:08.298086 4791 generic.go:334] "Generic (PLEG): container finished" podID="b4f2b02f-cfc0-42a9-832d-adb0268cc26d" containerID="585bacda4be2074c95265d66c30093285f0e868bffc068a4d3024a00f52106f7" exitCode=0 Feb 17 00:15:08 crc kubenswrapper[4791]: I0217 00:15:08.298150 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerDied","Data":"585bacda4be2074c95265d66c30093285f0e868bffc068a4d3024a00f52106f7"} Feb 17 00:15:08 crc kubenswrapper[4791]: I0217 00:15:08.298458 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" 
event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"512468c2f81ce78eb3c74943d6bcb5168342c22846b822ee264278c60f5d33cd"} Feb 17 00:15:09 crc kubenswrapper[4791]: I0217 00:15:09.230284 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7fe508f-1e8c-4da7-8f99-108e73cb3791" path="/var/lib/kubelet/pods/e7fe508f-1e8c-4da7-8f99-108e73cb3791/volumes" Feb 17 00:15:09 crc kubenswrapper[4791]: I0217 00:15:09.311700 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"90e65f8c4449195dd5f2d09a2fc95b311cfbdd8f7490988154f053dff6c96d25"} Feb 17 00:15:09 crc kubenswrapper[4791]: I0217 00:15:09.311748 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"7a766979695ee77b4349b273f68f6946cf77a3df27993971861adc5c90b2150d"} Feb 17 00:15:09 crc kubenswrapper[4791]: I0217 00:15:09.311766 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"4321bb07633317ea5c5d2c079e48ffe11aea22b184cb12bf4465c6d02470e055"} Feb 17 00:15:09 crc kubenswrapper[4791]: I0217 00:15:09.311776 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"77b859ce02b9ed54f03c44b5ba8a45e0fd01d5247eecd158ae4b0652397f5b8b"} Feb 17 00:15:09 crc kubenswrapper[4791]: I0217 00:15:09.311785 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" 
event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"f85e14266e55e68e5b052b3fd0448e23cf9430ef8d3abc131ced675b9545599a"} Feb 17 00:15:09 crc kubenswrapper[4791]: I0217 00:15:09.311795 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"4fd8cb0e6ae478527d55bfae465a834cfd420e5c83378db28477e6272c75336b"} Feb 17 00:15:12 crc kubenswrapper[4791]: I0217 00:15:12.339193 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"8b01a32a7b8802fec40cdd1a91fddabe6819202e7f25b363c9ee688843e5ddc1"} Feb 17 00:15:14 crc kubenswrapper[4791]: I0217 00:15:14.352751 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" event={"ID":"b4f2b02f-cfc0-42a9-832d-adb0268cc26d","Type":"ContainerStarted","Data":"1210297e17828fcc2f04ee549013a7ffa2a2a0b7da6fbbbf1801ccc000d6e576"} Feb 17 00:15:14 crc kubenswrapper[4791]: I0217 00:15:14.353085 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:14 crc kubenswrapper[4791]: I0217 00:15:14.353265 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:14 crc kubenswrapper[4791]: I0217 00:15:14.353278 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:14 crc kubenswrapper[4791]: I0217 00:15:14.387418 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" podStartSLOduration=7.387403271 podStartE2EDuration="7.387403271s" podCreationTimestamp="2026-02-17 00:15:07 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:15:14.386781911 +0000 UTC m=+571.866294438" watchObservedRunningTime="2026-02-17 00:15:14.387403271 +0000 UTC m=+571.866915798" Feb 17 00:15:14 crc kubenswrapper[4791]: I0217 00:15:14.405907 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:14 crc kubenswrapper[4791]: I0217 00:15:14.407765 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:20 crc kubenswrapper[4791]: I0217 00:15:20.221168 4791 scope.go:117] "RemoveContainer" containerID="583de3e38aa471160a347992af3add013ed63c250a69e768895a5ca5ef559bea" Feb 17 00:15:20 crc kubenswrapper[4791]: E0217 00:15:20.221875 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-299s7_openshift-multus(1104c109-74aa-4fc4-8a1b-914a0d5803a4)\"" pod="openshift-multus/multus-299s7" podUID="1104c109-74aa-4fc4-8a1b-914a0d5803a4" Feb 17 00:15:24 crc kubenswrapper[4791]: I0217 00:15:24.973048 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:15:24 crc kubenswrapper[4791]: I0217 00:15:24.974449 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:15:33 crc kubenswrapper[4791]: I0217 
00:15:33.222748 4791 scope.go:117] "RemoveContainer" containerID="583de3e38aa471160a347992af3add013ed63c250a69e768895a5ca5ef559bea" Feb 17 00:15:33 crc kubenswrapper[4791]: I0217 00:15:33.486269 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/2.log" Feb 17 00:15:33 crc kubenswrapper[4791]: I0217 00:15:33.487524 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/1.log" Feb 17 00:15:33 crc kubenswrapper[4791]: I0217 00:15:33.487615 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-299s7" event={"ID":"1104c109-74aa-4fc4-8a1b-914a0d5803a4","Type":"ContainerStarted","Data":"e2e684b560a24deca25b7db8c32d2dceb7734559d94b660c79340702f0e6bb29"} Feb 17 00:15:37 crc kubenswrapper[4791]: I0217 00:15:37.500171 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6mh9d" Feb 17 00:15:43 crc kubenswrapper[4791]: I0217 00:15:43.513991 4791 scope.go:117] "RemoveContainer" containerID="6db675f531aa30ad1f83255bf72a659dd7e60aaecdd681515208407c414c903c" Feb 17 00:15:44 crc kubenswrapper[4791]: I0217 00:15:44.558657 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-299s7_1104c109-74aa-4fc4-8a1b-914a0d5803a4/kube-multus/2.log" Feb 17 00:15:54 crc kubenswrapper[4791]: I0217 00:15:54.972643 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:15:54 crc kubenswrapper[4791]: I0217 00:15:54.973024 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" 
podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:16:07 crc kubenswrapper[4791]: I0217 00:16:07.693093 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbgxw"] Feb 17 00:16:07 crc kubenswrapper[4791]: I0217 00:16:07.694226 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lbgxw" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="registry-server" containerID="cri-o://c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751" gracePeriod=30 Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.032892 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.161875 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-catalog-content\") pod \"ce989914-c2c6-4717-9acb-161dd734b4f6\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.161943 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz657\" (UniqueName: \"kubernetes.io/projected/ce989914-c2c6-4717-9acb-161dd734b4f6-kube-api-access-rz657\") pod \"ce989914-c2c6-4717-9acb-161dd734b4f6\" (UID: \"ce989914-c2c6-4717-9acb-161dd734b4f6\") " Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.162021 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-utilities\") pod \"ce989914-c2c6-4717-9acb-161dd734b4f6\" (UID: 
\"ce989914-c2c6-4717-9acb-161dd734b4f6\") " Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.163597 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-utilities" (OuterVolumeSpecName: "utilities") pod "ce989914-c2c6-4717-9acb-161dd734b4f6" (UID: "ce989914-c2c6-4717-9acb-161dd734b4f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.168596 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce989914-c2c6-4717-9acb-161dd734b4f6-kube-api-access-rz657" (OuterVolumeSpecName: "kube-api-access-rz657") pod "ce989914-c2c6-4717-9acb-161dd734b4f6" (UID: "ce989914-c2c6-4717-9acb-161dd734b4f6"). InnerVolumeSpecName "kube-api-access-rz657". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.194379 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce989914-c2c6-4717-9acb-161dd734b4f6" (UID: "ce989914-c2c6-4717-9acb-161dd734b4f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.264100 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.264168 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz657\" (UniqueName: \"kubernetes.io/projected/ce989914-c2c6-4717-9acb-161dd734b4f6-kube-api-access-rz657\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.264184 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce989914-c2c6-4717-9acb-161dd734b4f6-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.727875 4791 generic.go:334] "Generic (PLEG): container finished" podID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerID="c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751" exitCode=0 Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.727931 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbgxw" event={"ID":"ce989914-c2c6-4717-9acb-161dd734b4f6","Type":"ContainerDied","Data":"c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751"} Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.727955 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbgxw" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.727982 4791 scope.go:117] "RemoveContainer" containerID="c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.727966 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbgxw" event={"ID":"ce989914-c2c6-4717-9acb-161dd734b4f6","Type":"ContainerDied","Data":"0c0c0ef37f45961765809fc7c0c9b4244d7c69f4387e7e7fe8ce8a7787ea122a"} Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.746166 4791 scope.go:117] "RemoveContainer" containerID="e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.760540 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbgxw"] Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.764348 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbgxw"] Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.787530 4791 scope.go:117] "RemoveContainer" containerID="3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.803711 4791 scope.go:117] "RemoveContainer" containerID="c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751" Feb 17 00:16:08 crc kubenswrapper[4791]: E0217 00:16:08.804349 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751\": container with ID starting with c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751 not found: ID does not exist" containerID="c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.804397 4791 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751"} err="failed to get container status \"c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751\": rpc error: code = NotFound desc = could not find container \"c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751\": container with ID starting with c1899b35ee23af37b31e777d967d615f8c6498d7792dff26cf2154a7a42f8751 not found: ID does not exist" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.804429 4791 scope.go:117] "RemoveContainer" containerID="e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2" Feb 17 00:16:08 crc kubenswrapper[4791]: E0217 00:16:08.804884 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2\": container with ID starting with e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2 not found: ID does not exist" containerID="e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.804922 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2"} err="failed to get container status \"e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2\": rpc error: code = NotFound desc = could not find container \"e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2\": container with ID starting with e6388094dea6e5f3fc85b3f7c61dbf221cb823d36d986f1dd74b4e94b0c276b2 not found: ID does not exist" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.804950 4791 scope.go:117] "RemoveContainer" containerID="3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195" Feb 17 00:16:08 crc kubenswrapper[4791]: E0217 
00:16:08.805402 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195\": container with ID starting with 3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195 not found: ID does not exist" containerID="3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195" Feb 17 00:16:08 crc kubenswrapper[4791]: I0217 00:16:08.805441 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195"} err="failed to get container status \"3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195\": rpc error: code = NotFound desc = could not find container \"3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195\": container with ID starting with 3a4f0fbfdaf9197d1c655a275dbc9b73975f167cf4abd10a9fd33aab398c4195 not found: ID does not exist" Feb 17 00:16:09 crc kubenswrapper[4791]: I0217 00:16:09.234578 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" path="/var/lib/kubelet/pods/ce989914-c2c6-4717-9acb-161dd734b4f6/volumes" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.511047 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"] Feb 17 00:16:11 crc kubenswrapper[4791]: E0217 00:16:11.511302 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="extract-utilities" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.511317 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="extract-utilities" Feb 17 00:16:11 crc kubenswrapper[4791]: E0217 00:16:11.511329 4791 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="registry-server" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.511337 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="registry-server" Feb 17 00:16:11 crc kubenswrapper[4791]: E0217 00:16:11.511349 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="extract-content" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.511357 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="extract-content" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.511469 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce989914-c2c6-4717-9acb-161dd734b4f6" containerName="registry-server" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.512376 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.517632 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.537072 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"] Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.600410 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcnq5\" (UniqueName: \"kubernetes.io/projected/306a7321-68e3-4f13-95d0-3c3dbee8b24f-kube-api-access-qcnq5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" Feb 
17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.600475 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.600510 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.701690 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.701788 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcnq5\" (UniqueName: \"kubernetes.io/projected/306a7321-68e3-4f13-95d0-3c3dbee8b24f-kube-api-access-qcnq5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.701842 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.702280 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.702402 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.723287 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcnq5\" (UniqueName: \"kubernetes.io/projected/306a7321-68e3-4f13-95d0-3c3dbee8b24f-kube-api-access-qcnq5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" Feb 17 00:16:11 crc kubenswrapper[4791]: I0217 00:16:11.828913 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" Feb 17 00:16:12 crc kubenswrapper[4791]: I0217 00:16:12.081410 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl"] Feb 17 00:16:12 crc kubenswrapper[4791]: I0217 00:16:12.752999 4791 generic.go:334] "Generic (PLEG): container finished" podID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerID="2f5d3f8df0535fd228835286243796212d5a510efdc64603f9e1d050615d4714" exitCode=0 Feb 17 00:16:12 crc kubenswrapper[4791]: I0217 00:16:12.753046 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" event={"ID":"306a7321-68e3-4f13-95d0-3c3dbee8b24f","Type":"ContainerDied","Data":"2f5d3f8df0535fd228835286243796212d5a510efdc64603f9e1d050615d4714"} Feb 17 00:16:12 crc kubenswrapper[4791]: I0217 00:16:12.753075 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" event={"ID":"306a7321-68e3-4f13-95d0-3c3dbee8b24f","Type":"ContainerStarted","Data":"0869a21a026a58cea076884833871d1ce8ea04be88334ba3e5be2837d4fb535f"} Feb 17 00:16:12 crc kubenswrapper[4791]: I0217 00:16:12.758472 4791 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 00:16:14 crc kubenswrapper[4791]: I0217 00:16:14.770940 4791 generic.go:334] "Generic (PLEG): container finished" podID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerID="3e95f87bb062f252217212cffa73bf3194cc09b5136dec0e11fa71b8ff76fb22" exitCode=0 Feb 17 00:16:14 crc kubenswrapper[4791]: I0217 00:16:14.771050 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" 
event={"ID":"306a7321-68e3-4f13-95d0-3c3dbee8b24f","Type":"ContainerDied","Data":"3e95f87bb062f252217212cffa73bf3194cc09b5136dec0e11fa71b8ff76fb22"} Feb 17 00:16:15 crc kubenswrapper[4791]: I0217 00:16:15.784824 4791 generic.go:334] "Generic (PLEG): container finished" podID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerID="ea53282404244d4b2707e959ca82bee2f7679d571902217b5dd1d3b672e7ad12" exitCode=0 Feb 17 00:16:15 crc kubenswrapper[4791]: I0217 00:16:15.784917 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" event={"ID":"306a7321-68e3-4f13-95d0-3c3dbee8b24f","Type":"ContainerDied","Data":"ea53282404244d4b2707e959ca82bee2f7679d571902217b5dd1d3b672e7ad12"} Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.105656 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.178198 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-util\") pod \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.178561 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-bundle\") pod \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\" (UID: \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.178619 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcnq5\" (UniqueName: \"kubernetes.io/projected/306a7321-68e3-4f13-95d0-3c3dbee8b24f-kube-api-access-qcnq5\") pod \"306a7321-68e3-4f13-95d0-3c3dbee8b24f\" (UID: 
\"306a7321-68e3-4f13-95d0-3c3dbee8b24f\") " Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.192564 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306a7321-68e3-4f13-95d0-3c3dbee8b24f-kube-api-access-qcnq5" (OuterVolumeSpecName: "kube-api-access-qcnq5") pod "306a7321-68e3-4f13-95d0-3c3dbee8b24f" (UID: "306a7321-68e3-4f13-95d0-3c3dbee8b24f"). InnerVolumeSpecName "kube-api-access-qcnq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.196164 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-bundle" (OuterVolumeSpecName: "bundle") pod "306a7321-68e3-4f13-95d0-3c3dbee8b24f" (UID: "306a7321-68e3-4f13-95d0-3c3dbee8b24f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.215455 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-util" (OuterVolumeSpecName: "util") pod "306a7321-68e3-4f13-95d0-3c3dbee8b24f" (UID: "306a7321-68e3-4f13-95d0-3c3dbee8b24f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.280464 4791 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-util\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.280520 4791 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/306a7321-68e3-4f13-95d0-3c3dbee8b24f-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.280539 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcnq5\" (UniqueName: \"kubernetes.io/projected/306a7321-68e3-4f13-95d0-3c3dbee8b24f-kube-api-access-qcnq5\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.485066 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"] Feb 17 00:16:17 crc kubenswrapper[4791]: E0217 00:16:17.485388 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerName="pull" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.485410 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerName="pull" Feb 17 00:16:17 crc kubenswrapper[4791]: E0217 00:16:17.485423 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerName="extract" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.485433 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerName="extract" Feb 17 00:16:17 crc kubenswrapper[4791]: E0217 00:16:17.485457 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerName="util" Feb 
17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.485469 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerName="util" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.485636 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="306a7321-68e3-4f13-95d0-3c3dbee8b24f" containerName="extract" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.487220 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.503404 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"] Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.584126 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.584410 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhrzd\" (UniqueName: \"kubernetes.io/projected/bfce146a-61fa-4821-ab43-8fd35dc5fe07-kube-api-access-lhrzd\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.584500 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.686346 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhrzd\" (UniqueName: \"kubernetes.io/projected/bfce146a-61fa-4821-ab43-8fd35dc5fe07-kube-api-access-lhrzd\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.686439 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.686513 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.687397 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: 
\"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.687501 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.717227 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhrzd\" (UniqueName: \"kubernetes.io/projected/bfce146a-61fa-4821-ab43-8fd35dc5fe07-kube-api-access-lhrzd\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.802088 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" event={"ID":"306a7321-68e3-4f13-95d0-3c3dbee8b24f","Type":"ContainerDied","Data":"0869a21a026a58cea076884833871d1ce8ea04be88334ba3e5be2837d4fb535f"} Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.802185 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0869a21a026a58cea076884833871d1ce8ea04be88334ba3e5be2837d4fb535f" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.802244 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl" Feb 17 00:16:17 crc kubenswrapper[4791]: I0217 00:16:17.816499 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.056790 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r"] Feb 17 00:16:18 crc kubenswrapper[4791]: W0217 00:16:18.065079 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfce146a_61fa_4821_ab43_8fd35dc5fe07.slice/crio-4ca4de3d6b02231c967c39cfd48ff6ae6a608b530c4a5c85b60edd49c810d8b1 WatchSource:0}: Error finding container 4ca4de3d6b02231c967c39cfd48ff6ae6a608b530c4a5c85b60edd49c810d8b1: Status 404 returned error can't find the container with id 4ca4de3d6b02231c967c39cfd48ff6ae6a608b530c4a5c85b60edd49c810d8b1 Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.262876 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"] Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.264407 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.275178 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"] Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.395648 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-util\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.395726 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-bundle\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.395798 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqjfw\" (UniqueName: \"kubernetes.io/projected/f156dae1-1d4a-47b3-835e-016325f1981c-kube-api-access-rqjfw\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.496889 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqjfw\" (UniqueName: 
\"kubernetes.io/projected/f156dae1-1d4a-47b3-835e-016325f1981c-kube-api-access-rqjfw\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.497471 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-util\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.497777 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-bundle\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.498383 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-util\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.498453 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-bundle\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: 
\"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.526290 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqjfw\" (UniqueName: \"kubernetes.io/projected/f156dae1-1d4a-47b3-835e-016325f1981c-kube-api-access-rqjfw\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.580484 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.810903 4791 generic.go:334] "Generic (PLEG): container finished" podID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerID="78490f2b3615a7ddb7efd4af6351c0526b6b3fa122e398e922bf6a6ec7a152b3" exitCode=0 Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.810995 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" event={"ID":"bfce146a-61fa-4821-ab43-8fd35dc5fe07","Type":"ContainerDied","Data":"78490f2b3615a7ddb7efd4af6351c0526b6b3fa122e398e922bf6a6ec7a152b3"} Feb 17 00:16:18 crc kubenswrapper[4791]: I0217 00:16:18.811322 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" event={"ID":"bfce146a-61fa-4821-ab43-8fd35dc5fe07","Type":"ContainerStarted","Data":"4ca4de3d6b02231c967c39cfd48ff6ae6a608b530c4a5c85b60edd49c810d8b1"} Feb 17 00:16:19 crc kubenswrapper[4791]: I0217 00:16:19.050361 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb"] Feb 17 00:16:19 crc kubenswrapper[4791]: I0217 00:16:19.828477 4791 generic.go:334] "Generic (PLEG): container finished" podID="f156dae1-1d4a-47b3-835e-016325f1981c" containerID="68ef460952e68de60b1a58d94e99269c6162da95c7114adeff4d4e552688fd57" exitCode=0 Feb 17 00:16:19 crc kubenswrapper[4791]: I0217 00:16:19.828576 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" event={"ID":"f156dae1-1d4a-47b3-835e-016325f1981c","Type":"ContainerDied","Data":"68ef460952e68de60b1a58d94e99269c6162da95c7114adeff4d4e552688fd57"} Feb 17 00:16:19 crc kubenswrapper[4791]: I0217 00:16:19.828831 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" event={"ID":"f156dae1-1d4a-47b3-835e-016325f1981c","Type":"ContainerStarted","Data":"43ac63093f7b6c93cf4fb80892a768e3f1ded2f4bd27836925fe1e0e4db43b6b"} Feb 17 00:16:20 crc kubenswrapper[4791]: I0217 00:16:20.837227 4791 generic.go:334] "Generic (PLEG): container finished" podID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerID="70da6909dff98f6a15c3e6d61c22d2123946b8b9e42b29c7114857574709a440" exitCode=0 Feb 17 00:16:20 crc kubenswrapper[4791]: I0217 00:16:20.837325 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" event={"ID":"bfce146a-61fa-4821-ab43-8fd35dc5fe07","Type":"ContainerDied","Data":"70da6909dff98f6a15c3e6d61c22d2123946b8b9e42b29c7114857574709a440"} Feb 17 00:16:20 crc kubenswrapper[4791]: I0217 00:16:20.839934 4791 generic.go:334] "Generic (PLEG): container finished" podID="f156dae1-1d4a-47b3-835e-016325f1981c" containerID="1af4cb3c07106a022003314ab268e8d7fd6f96516f42b97a6d5809a8d5ce3225" exitCode=0 Feb 17 00:16:20 crc kubenswrapper[4791]: I0217 
00:16:20.839988 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" event={"ID":"f156dae1-1d4a-47b3-835e-016325f1981c","Type":"ContainerDied","Data":"1af4cb3c07106a022003314ab268e8d7fd6f96516f42b97a6d5809a8d5ce3225"} Feb 17 00:16:21 crc kubenswrapper[4791]: I0217 00:16:21.844990 4791 generic.go:334] "Generic (PLEG): container finished" podID="f156dae1-1d4a-47b3-835e-016325f1981c" containerID="44a4a5ae5c2f4f7f3cfac080a77f559de7e938d185002fad8d70a24cb2d0a5ee" exitCode=0 Feb 17 00:16:21 crc kubenswrapper[4791]: I0217 00:16:21.845092 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" event={"ID":"f156dae1-1d4a-47b3-835e-016325f1981c","Type":"ContainerDied","Data":"44a4a5ae5c2f4f7f3cfac080a77f559de7e938d185002fad8d70a24cb2d0a5ee"} Feb 17 00:16:21 crc kubenswrapper[4791]: I0217 00:16:21.846984 4791 generic.go:334] "Generic (PLEG): container finished" podID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerID="84cf57ff591c5b3543988d83f4058aa23d61fd03b9c193df5a592ba5158f2116" exitCode=0 Feb 17 00:16:21 crc kubenswrapper[4791]: I0217 00:16:21.847025 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" event={"ID":"bfce146a-61fa-4821-ab43-8fd35dc5fe07","Type":"ContainerDied","Data":"84cf57ff591c5b3543988d83f4058aa23d61fd03b9c193df5a592ba5158f2116"} Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.258497 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.280474 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.377233 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-util\") pod \"f156dae1-1d4a-47b3-835e-016325f1981c\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.377299 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhrzd\" (UniqueName: \"kubernetes.io/projected/bfce146a-61fa-4821-ab43-8fd35dc5fe07-kube-api-access-lhrzd\") pod \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.377325 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-bundle\") pod \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.377346 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-bundle\") pod \"f156dae1-1d4a-47b3-835e-016325f1981c\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.377407 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqjfw\" (UniqueName: \"kubernetes.io/projected/f156dae1-1d4a-47b3-835e-016325f1981c-kube-api-access-rqjfw\") pod \"f156dae1-1d4a-47b3-835e-016325f1981c\" (UID: \"f156dae1-1d4a-47b3-835e-016325f1981c\") " Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.377439 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-util\") pod \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\" (UID: \"bfce146a-61fa-4821-ab43-8fd35dc5fe07\") " Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.377914 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-bundle" (OuterVolumeSpecName: "bundle") pod "bfce146a-61fa-4821-ab43-8fd35dc5fe07" (UID: "bfce146a-61fa-4821-ab43-8fd35dc5fe07"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.379208 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-bundle" (OuterVolumeSpecName: "bundle") pod "f156dae1-1d4a-47b3-835e-016325f1981c" (UID: "f156dae1-1d4a-47b3-835e-016325f1981c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.384304 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f156dae1-1d4a-47b3-835e-016325f1981c-kube-api-access-rqjfw" (OuterVolumeSpecName: "kube-api-access-rqjfw") pod "f156dae1-1d4a-47b3-835e-016325f1981c" (UID: "f156dae1-1d4a-47b3-835e-016325f1981c"). InnerVolumeSpecName "kube-api-access-rqjfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.394062 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-util" (OuterVolumeSpecName: "util") pod "f156dae1-1d4a-47b3-835e-016325f1981c" (UID: "f156dae1-1d4a-47b3-835e-016325f1981c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.395814 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-util" (OuterVolumeSpecName: "util") pod "bfce146a-61fa-4821-ab43-8fd35dc5fe07" (UID: "bfce146a-61fa-4821-ab43-8fd35dc5fe07"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.401239 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfce146a-61fa-4821-ab43-8fd35dc5fe07-kube-api-access-lhrzd" (OuterVolumeSpecName: "kube-api-access-lhrzd") pod "bfce146a-61fa-4821-ab43-8fd35dc5fe07" (UID: "bfce146a-61fa-4821-ab43-8fd35dc5fe07"). InnerVolumeSpecName "kube-api-access-lhrzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.478597 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhrzd\" (UniqueName: \"kubernetes.io/projected/bfce146a-61fa-4821-ab43-8fd35dc5fe07-kube-api-access-lhrzd\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.478626 4791 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.478635 4791 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.478643 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqjfw\" (UniqueName: \"kubernetes.io/projected/f156dae1-1d4a-47b3-835e-016325f1981c-kube-api-access-rqjfw\") on node \"crc\" DevicePath \"\"" 
Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.478651 4791 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfce146a-61fa-4821-ab43-8fd35dc5fe07-util\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.478659 4791 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f156dae1-1d4a-47b3-835e-016325f1981c-util\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.859362 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.859388 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb" event={"ID":"f156dae1-1d4a-47b3-835e-016325f1981c","Type":"ContainerDied","Data":"43ac63093f7b6c93cf4fb80892a768e3f1ded2f4bd27836925fe1e0e4db43b6b"} Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.859427 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43ac63093f7b6c93cf4fb80892a768e3f1ded2f4bd27836925fe1e0e4db43b6b" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.861129 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" event={"ID":"bfce146a-61fa-4821-ab43-8fd35dc5fe07","Type":"ContainerDied","Data":"4ca4de3d6b02231c967c39cfd48ff6ae6a608b530c4a5c85b60edd49c810d8b1"} Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.861168 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ca4de3d6b02231c967c39cfd48ff6ae6a608b530c4a5c85b60edd49c810d8b1" Feb 17 00:16:23 crc kubenswrapper[4791]: I0217 00:16:23.861280 4791 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r" Feb 17 00:16:24 crc kubenswrapper[4791]: I0217 00:16:24.973520 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:16:24 crc kubenswrapper[4791]: I0217 00:16:24.973604 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:16:24 crc kubenswrapper[4791]: I0217 00:16:24.973672 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:16:24 crc kubenswrapper[4791]: I0217 00:16:24.974523 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5711491f64a78b6ffc6b1c8acbd4b44d9c0260c393f0431c6d0856df0b4e56a4"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 00:16:24 crc kubenswrapper[4791]: I0217 00:16:24.974644 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://5711491f64a78b6ffc6b1c8acbd4b44d9c0260c393f0431c6d0856df0b4e56a4" gracePeriod=600 Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.873579 4791 generic.go:334] "Generic 
(PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="5711491f64a78b6ffc6b1c8acbd4b44d9c0260c393f0431c6d0856df0b4e56a4" exitCode=0 Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.873608 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"5711491f64a78b6ffc6b1c8acbd4b44d9c0260c393f0431c6d0856df0b4e56a4"} Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.874150 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"25ad102ac8c402951283e13e1752712dd2cdbf609cf80aba767b2bf348b988eb"} Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.874176 4791 scope.go:117] "RemoveContainer" containerID="9f0176ad0d56ad2a09e7895c1ba79efcf2efe001559c2c246d6cab82cbcdac2f" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952298 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj"] Feb 17 00:16:25 crc kubenswrapper[4791]: E0217 00:16:25.952507 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f156dae1-1d4a-47b3-835e-016325f1981c" containerName="util" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952523 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f156dae1-1d4a-47b3-835e-016325f1981c" containerName="util" Feb 17 00:16:25 crc kubenswrapper[4791]: E0217 00:16:25.952537 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerName="extract" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952543 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerName="extract" Feb 17 00:16:25 crc kubenswrapper[4791]: 
E0217 00:16:25.952553 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerName="pull" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952558 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerName="pull" Feb 17 00:16:25 crc kubenswrapper[4791]: E0217 00:16:25.952568 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f156dae1-1d4a-47b3-835e-016325f1981c" containerName="extract" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952573 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f156dae1-1d4a-47b3-835e-016325f1981c" containerName="extract" Feb 17 00:16:25 crc kubenswrapper[4791]: E0217 00:16:25.952582 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f156dae1-1d4a-47b3-835e-016325f1981c" containerName="pull" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952588 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f156dae1-1d4a-47b3-835e-016325f1981c" containerName="pull" Feb 17 00:16:25 crc kubenswrapper[4791]: E0217 00:16:25.952598 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerName="util" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952604 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerName="util" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952703 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f156dae1-1d4a-47b3-835e-016325f1981c" containerName="extract" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.952712 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfce146a-61fa-4821-ab43-8fd35dc5fe07" containerName="extract" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.953064 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.955917 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rm4vf" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.956184 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.957473 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.969073 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj"] Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.992592 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5"] Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.993357 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.995058 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 17 00:16:25 crc kubenswrapper[4791]: I0217 00:16:25.997247 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-6m5f5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.006583 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.010230 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5h5l\" (UniqueName: \"kubernetes.io/projected/f73f7b40-6611-465e-ae69-d2f70ce77651-kube-api-access-r5h5l\") pod \"obo-prometheus-operator-68bc856cb9-rw6pj\" (UID: \"f73f7b40-6611-465e-ae69-d2f70ce77651\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.012133 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.012754 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.025581 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.112024 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c43370f-07b8-4f84-b716-34af90be5850-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5\" (UID: \"8c43370f-07b8-4f84-b716-34af90be5850\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.112101 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c43370f-07b8-4f84-b716-34af90be5850-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5\" (UID: \"8c43370f-07b8-4f84-b716-34af90be5850\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.112143 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b046e97f-6343-4e3f-ae0a-0fb40687d992-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f\" (UID: \"b046e97f-6343-4e3f-ae0a-0fb40687d992\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.112190 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/b046e97f-6343-4e3f-ae0a-0fb40687d992-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f\" (UID: \"b046e97f-6343-4e3f-ae0a-0fb40687d992\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.112230 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5h5l\" (UniqueName: \"kubernetes.io/projected/f73f7b40-6611-465e-ae69-d2f70ce77651-kube-api-access-r5h5l\") pod \"obo-prometheus-operator-68bc856cb9-rw6pj\" (UID: \"f73f7b40-6611-465e-ae69-d2f70ce77651\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.129406 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5h5l\" (UniqueName: \"kubernetes.io/projected/f73f7b40-6611-465e-ae69-d2f70ce77651-kube-api-access-r5h5l\") pod \"obo-prometheus-operator-68bc856cb9-rw6pj\" (UID: \"f73f7b40-6611-465e-ae69-d2f70ce77651\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.179338 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-v2lwp"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.180699 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.195274 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.195466 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-gngdj" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.201452 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-v2lwp"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.213026 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c43370f-07b8-4f84-b716-34af90be5850-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5\" (UID: \"8c43370f-07b8-4f84-b716-34af90be5850\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.213164 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c43370f-07b8-4f84-b716-34af90be5850-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5\" (UID: \"8c43370f-07b8-4f84-b716-34af90be5850\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.213262 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b046e97f-6343-4e3f-ae0a-0fb40687d992-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f\" (UID: \"b046e97f-6343-4e3f-ae0a-0fb40687d992\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.213374 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b046e97f-6343-4e3f-ae0a-0fb40687d992-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f\" (UID: \"b046e97f-6343-4e3f-ae0a-0fb40687d992\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.216127 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b046e97f-6343-4e3f-ae0a-0fb40687d992-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f\" (UID: \"b046e97f-6343-4e3f-ae0a-0fb40687d992\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.216702 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c43370f-07b8-4f84-b716-34af90be5850-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5\" (UID: \"8c43370f-07b8-4f84-b716-34af90be5850\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.218377 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c43370f-07b8-4f84-b716-34af90be5850-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5\" (UID: \"8c43370f-07b8-4f84-b716-34af90be5850\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.219152 4791 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b046e97f-6343-4e3f-ae0a-0fb40687d992-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f\" (UID: \"b046e97f-6343-4e3f-ae0a-0fb40687d992\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.277361 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.310840 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.314475 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/307585d5-5ed8-43df-b5d8-977729339610-observability-operator-tls\") pod \"observability-operator-59bdc8b94-v2lwp\" (UID: \"307585d5-5ed8-43df-b5d8-977729339610\") " pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.314530 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlmv9\" (UniqueName: \"kubernetes.io/projected/307585d5-5ed8-43df-b5d8-977729339610-kube-api-access-zlmv9\") pod \"observability-operator-59bdc8b94-v2lwp\" (UID: \"307585d5-5ed8-43df-b5d8-977729339610\") " pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.324525 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.395398 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-mk4lp"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.396556 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.400696 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-2q4sx" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.416283 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-mk4lp"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.417024 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/307585d5-5ed8-43df-b5d8-977729339610-observability-operator-tls\") pod \"observability-operator-59bdc8b94-v2lwp\" (UID: \"307585d5-5ed8-43df-b5d8-977729339610\") " pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.417094 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlmv9\" (UniqueName: \"kubernetes.io/projected/307585d5-5ed8-43df-b5d8-977729339610-kube-api-access-zlmv9\") pod \"observability-operator-59bdc8b94-v2lwp\" (UID: \"307585d5-5ed8-43df-b5d8-977729339610\") " pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.434292 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/307585d5-5ed8-43df-b5d8-977729339610-observability-operator-tls\") pod 
\"observability-operator-59bdc8b94-v2lwp\" (UID: \"307585d5-5ed8-43df-b5d8-977729339610\") " pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.441969 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlmv9\" (UniqueName: \"kubernetes.io/projected/307585d5-5ed8-43df-b5d8-977729339610-kube-api-access-zlmv9\") pod \"observability-operator-59bdc8b94-v2lwp\" (UID: \"307585d5-5ed8-43df-b5d8-977729339610\") " pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.470457 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.471419 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.477433 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.482221 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.523251 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4swr\" (UniqueName: \"kubernetes.io/projected/3b110234-d36d-4ced-a2be-7913bbb84d2a-kube-api-access-z4swr\") pod \"perses-operator-5bf474d74f-mk4lp\" (UID: \"3b110234-d36d-4ced-a2be-7913bbb84d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.523336 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b110234-d36d-4ced-a2be-7913bbb84d2a-openshift-service-ca\") pod \"perses-operator-5bf474d74f-mk4lp\" (UID: \"3b110234-d36d-4ced-a2be-7913bbb84d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.523262 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.624860 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4swr\" (UniqueName: \"kubernetes.io/projected/3b110234-d36d-4ced-a2be-7913bbb84d2a-kube-api-access-z4swr\") pod \"perses-operator-5bf474d74f-mk4lp\" (UID: \"3b110234-d36d-4ced-a2be-7913bbb84d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.625383 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.625410 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.625466 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b110234-d36d-4ced-a2be-7913bbb84d2a-openshift-service-ca\") pod \"perses-operator-5bf474d74f-mk4lp\" (UID: \"3b110234-d36d-4ced-a2be-7913bbb84d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.625615 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2m9m\" (UniqueName: \"kubernetes.io/projected/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-kube-api-access-l2m9m\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.626737 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b110234-d36d-4ced-a2be-7913bbb84d2a-openshift-service-ca\") pod \"perses-operator-5bf474d74f-mk4lp\" (UID: \"3b110234-d36d-4ced-a2be-7913bbb84d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.644970 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4swr\" (UniqueName: \"kubernetes.io/projected/3b110234-d36d-4ced-a2be-7913bbb84d2a-kube-api-access-z4swr\") pod \"perses-operator-5bf474d74f-mk4lp\" (UID: \"3b110234-d36d-4ced-a2be-7913bbb84d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.726311 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2m9m\" (UniqueName: \"kubernetes.io/projected/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-kube-api-access-l2m9m\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: 
\"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.726387 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.726418 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.726837 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.726938 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.742814 4791 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2m9m\" (UniqueName: \"kubernetes.io/projected/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-kube-api-access-l2m9m\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.750029 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.775422 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj"] Feb 17 00:16:26 crc kubenswrapper[4791]: W0217 00:16:26.784933 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf73f7b40_6611_465e_ae69_d2f70ce77651.slice/crio-9776249e75c7322d3271cc3a06e82656659fc6ffea4e5b1adcea27d5b6735971 WatchSource:0}: Error finding container 9776249e75c7322d3271cc3a06e82656659fc6ffea4e5b1adcea27d5b6735971: Status 404 returned error can't find the container with id 9776249e75c7322d3271cc3a06e82656659fc6ffea4e5b1adcea27d5b6735971 Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.826523 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.856146 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f"] Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.866177 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5"] Feb 17 00:16:26 crc kubenswrapper[4791]: W0217 00:16:26.879091 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb046e97f_6343_4e3f_ae0a_0fb40687d992.slice/crio-5bcb3d7b025d938474ab7ed02a2f5e637b5fbbd4db988c189618bba5f0df3570 WatchSource:0}: Error finding container 5bcb3d7b025d938474ab7ed02a2f5e637b5fbbd4db988c189618bba5f0df3570: Status 404 returned error can't find the container with id 5bcb3d7b025d938474ab7ed02a2f5e637b5fbbd4db988c189618bba5f0df3570 Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 00:16:26.894011 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" event={"ID":"f73f7b40-6611-465e-ae69-d2f70ce77651","Type":"ContainerStarted","Data":"9776249e75c7322d3271cc3a06e82656659fc6ffea4e5b1adcea27d5b6735971"} Feb 17 00:16:26 crc kubenswrapper[4791]: W0217 00:16:26.900271 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c43370f_07b8_4f84_b716_34af90be5850.slice/crio-9d61bd01e2adcc639d1e246263bdf0b1f0f730e741c8c87213478e2ef2d6cde3 WatchSource:0}: Error finding container 9d61bd01e2adcc639d1e246263bdf0b1f0f730e741c8c87213478e2ef2d6cde3: Status 404 returned error can't find the container with id 9d61bd01e2adcc639d1e246263bdf0b1f0f730e741c8c87213478e2ef2d6cde3 Feb 17 00:16:26 crc kubenswrapper[4791]: I0217 
00:16:26.995804 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-mk4lp"] Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.023679 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-v2lwp"] Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.093044 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm"] Feb 17 00:16:27 crc kubenswrapper[4791]: W0217 00:16:27.114707 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba03d7dd_7e00_4b21_a86b_a2cabeb36ed9.slice/crio-d63afa2231bbe954c8b70ba459a87aaba8077722339c934771fc84e62117d1c8 WatchSource:0}: Error finding container d63afa2231bbe954c8b70ba459a87aaba8077722339c934771fc84e62117d1c8: Status 404 returned error can't find the container with id d63afa2231bbe954c8b70ba459a87aaba8077722339c934771fc84e62117d1c8 Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.901966 4791 generic.go:334] "Generic (PLEG): container finished" podID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerID="b48607b7d89d41c7ceb2f3bd92cc56bbed0f0f6540297a05fb55091429fcd5da" exitCode=0 Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.902091 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" event={"ID":"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9","Type":"ContainerDied","Data":"b48607b7d89d41c7ceb2f3bd92cc56bbed0f0f6540297a05fb55091429fcd5da"} Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.902371 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" 
event={"ID":"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9","Type":"ContainerStarted","Data":"d63afa2231bbe954c8b70ba459a87aaba8077722339c934771fc84e62117d1c8"} Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.904374 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" event={"ID":"b046e97f-6343-4e3f-ae0a-0fb40687d992","Type":"ContainerStarted","Data":"5bcb3d7b025d938474ab7ed02a2f5e637b5fbbd4db988c189618bba5f0df3570"} Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.905780 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" event={"ID":"307585d5-5ed8-43df-b5d8-977729339610","Type":"ContainerStarted","Data":"dd98b123aef2a50077000000d95f9b2e8173ce13179a87aa091eaff166d7f997"} Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.907989 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" event={"ID":"8c43370f-07b8-4f84-b716-34af90be5850","Type":"ContainerStarted","Data":"9d61bd01e2adcc639d1e246263bdf0b1f0f730e741c8c87213478e2ef2d6cde3"} Feb 17 00:16:27 crc kubenswrapper[4791]: I0217 00:16:27.909984 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" event={"ID":"3b110234-d36d-4ced-a2be-7913bbb84d2a","Type":"ContainerStarted","Data":"7e21575c377cff335ed8fe1114a7af3a0c188c3c4805ddaa4207691a89b4097d"} Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.657315 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-d5d58ff4c-lwcwp"] Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.659463 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.664013 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.664397 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.664607 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-6pnfb" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.665509 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.673783 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-d5d58ff4c-lwcwp"] Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.736732 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-apiservice-cert\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.736825 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-webhook-cert\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.736900 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cswvd\" (UniqueName: \"kubernetes.io/projected/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-kube-api-access-cswvd\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.838090 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-apiservice-cert\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.838191 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-webhook-cert\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.839063 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cswvd\" (UniqueName: \"kubernetes.io/projected/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-kube-api-access-cswvd\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.844650 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-webhook-cert\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.844792 4791 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-apiservice-cert\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.854235 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cswvd\" (UniqueName: \"kubernetes.io/projected/9d24e1c4-bdcf-4ffa-8138-b1fb47410471-kube-api-access-cswvd\") pod \"elastic-operator-d5d58ff4c-lwcwp\" (UID: \"9d24e1c4-bdcf-4ffa-8138-b1fb47410471\") " pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:33 crc kubenswrapper[4791]: I0217 00:16:33.984877 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.355507 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-8j89k"] Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.356360 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.358521 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-2fxsh" Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.368661 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-8j89k"] Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.474558 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k65np\" (UniqueName: \"kubernetes.io/projected/d51ceaf8-c8f2-4dc0-bbca-35d3562dea95-kube-api-access-k65np\") pod \"interconnect-operator-5bb49f789d-8j89k\" (UID: \"d51ceaf8-c8f2-4dc0-bbca-35d3562dea95\") " pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.575595 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k65np\" (UniqueName: \"kubernetes.io/projected/d51ceaf8-c8f2-4dc0-bbca-35d3562dea95-kube-api-access-k65np\") pod \"interconnect-operator-5bb49f789d-8j89k\" (UID: \"d51ceaf8-c8f2-4dc0-bbca-35d3562dea95\") " pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.597964 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k65np\" (UniqueName: \"kubernetes.io/projected/d51ceaf8-c8f2-4dc0-bbca-35d3562dea95-kube-api-access-k65np\") pod \"interconnect-operator-5bb49f789d-8j89k\" (UID: \"d51ceaf8-c8f2-4dc0-bbca-35d3562dea95\") " pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" Feb 17 00:16:36 crc kubenswrapper[4791]: I0217 00:16:36.674480 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" Feb 17 00:16:39 crc kubenswrapper[4791]: I0217 00:16:39.496723 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-8j89k"] Feb 17 00:16:39 crc kubenswrapper[4791]: I0217 00:16:39.611934 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-d5d58ff4c-lwcwp"] Feb 17 00:16:39 crc kubenswrapper[4791]: W0217 00:16:39.612396 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d24e1c4_bdcf_4ffa_8138_b1fb47410471.slice/crio-c93bc1a6cc722ea7bd8eb36ee6061b674a436fdc2a39e0ef0ae670b4fd379bb4 WatchSource:0}: Error finding container c93bc1a6cc722ea7bd8eb36ee6061b674a436fdc2a39e0ef0ae670b4fd379bb4: Status 404 returned error can't find the container with id c93bc1a6cc722ea7bd8eb36ee6061b674a436fdc2a39e0ef0ae670b4fd379bb4 Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.002367 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" event={"ID":"f73f7b40-6611-465e-ae69-d2f70ce77651","Type":"ContainerStarted","Data":"ad91fe460633ada4a8aca473c899623845f90dd0d80c637ca4a65fff008e362b"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.004583 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" event={"ID":"3b110234-d36d-4ced-a2be-7913bbb84d2a","Type":"ContainerStarted","Data":"ab9e75355363567b2b1284b61535372322efcbdc9ba8d2ff20a1a2b2079a47e3"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.004668 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.005851 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" event={"ID":"d51ceaf8-c8f2-4dc0-bbca-35d3562dea95","Type":"ContainerStarted","Data":"077c74483fe73e7c70896149bb63347cf7a239abea188efe5a2c96f70eef4a9b"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.009158 4791 generic.go:334] "Generic (PLEG): container finished" podID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerID="1611fbf3a08f12bfe96c53f7f199ba9f08cb8d6e0c390dadcd956fd2bbaf7a18" exitCode=0 Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.009230 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" event={"ID":"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9","Type":"ContainerDied","Data":"1611fbf3a08f12bfe96c53f7f199ba9f08cb8d6e0c390dadcd956fd2bbaf7a18"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.011445 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" event={"ID":"b046e97f-6343-4e3f-ae0a-0fb40687d992","Type":"ContainerStarted","Data":"2da4dd72dcd1c35bb0860091ac2cf49de3b25ff54851351b13dd6b08718b9245"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.012775 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" event={"ID":"9d24e1c4-bdcf-4ffa-8138-b1fb47410471","Type":"ContainerStarted","Data":"c93bc1a6cc722ea7bd8eb36ee6061b674a436fdc2a39e0ef0ae670b4fd379bb4"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.014620 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" event={"ID":"307585d5-5ed8-43df-b5d8-977729339610","Type":"ContainerStarted","Data":"106165e9d57803770c869270b914e95e426ac34ea14d8df1561b13b04049e1fc"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.014808 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.016907 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" event={"ID":"8c43370f-07b8-4f84-b716-34af90be5850","Type":"ContainerStarted","Data":"fc7ee0cb4835fd7f4c48103faddc9dcf102c386ddfd16779326abf98fe254ee1"} Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.018108 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.037132 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rw6pj" podStartSLOduration=2.664603978 podStartE2EDuration="15.037098715s" podCreationTimestamp="2026-02-17 00:16:25 +0000 UTC" firstStartedPulling="2026-02-17 00:16:26.787844471 +0000 UTC m=+644.267356998" lastFinishedPulling="2026-02-17 00:16:39.160339208 +0000 UTC m=+656.639851735" observedRunningTime="2026-02-17 00:16:40.032553643 +0000 UTC m=+657.512066170" watchObservedRunningTime="2026-02-17 00:16:40.037098715 +0000 UTC m=+657.516611242" Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.079641 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" podStartSLOduration=1.927164462 podStartE2EDuration="14.079625017s" podCreationTimestamp="2026-02-17 00:16:26 +0000 UTC" firstStartedPulling="2026-02-17 00:16:27.009223695 +0000 UTC m=+644.488736212" lastFinishedPulling="2026-02-17 00:16:39.16168424 +0000 UTC m=+656.641196767" observedRunningTime="2026-02-17 00:16:40.075508869 +0000 UTC m=+657.555021396" watchObservedRunningTime="2026-02-17 00:16:40.079625017 +0000 UTC m=+657.559137544" Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.091348 4791 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f" podStartSLOduration=2.896557962 podStartE2EDuration="15.091329583s" podCreationTimestamp="2026-02-17 00:16:25 +0000 UTC" firstStartedPulling="2026-02-17 00:16:26.890756425 +0000 UTC m=+644.370268952" lastFinishedPulling="2026-02-17 00:16:39.085528046 +0000 UTC m=+656.565040573" observedRunningTime="2026-02-17 00:16:40.088920638 +0000 UTC m=+657.568433165" watchObservedRunningTime="2026-02-17 00:16:40.091329583 +0000 UTC m=+657.570842110" Feb 17 00:16:40 crc kubenswrapper[4791]: I0217 00:16:40.141999 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-v2lwp" podStartSLOduration=2.012314184 podStartE2EDuration="14.141980339s" podCreationTimestamp="2026-02-17 00:16:26 +0000 UTC" firstStartedPulling="2026-02-17 00:16:27.03854344 +0000 UTC m=+644.518055967" lastFinishedPulling="2026-02-17 00:16:39.168209605 +0000 UTC m=+656.647722122" observedRunningTime="2026-02-17 00:16:40.121545109 +0000 UTC m=+657.601057636" watchObservedRunningTime="2026-02-17 00:16:40.141980339 +0000 UTC m=+657.621492866" Feb 17 00:16:41 crc kubenswrapper[4791]: I0217 00:16:41.025045 4791 generic.go:334] "Generic (PLEG): container finished" podID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerID="66476494ffaa29bc01d1efdd565e3f492af17843ab79206fd7d8c429a6feeb76" exitCode=0 Feb 17 00:16:41 crc kubenswrapper[4791]: I0217 00:16:41.025168 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" event={"ID":"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9","Type":"ContainerDied","Data":"66476494ffaa29bc01d1efdd565e3f492af17843ab79206fd7d8c429a6feeb76"} Feb 17 00:16:41 crc kubenswrapper[4791]: I0217 00:16:41.041336 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5" podStartSLOduration=3.873131483 podStartE2EDuration="16.041310774s" podCreationTimestamp="2026-02-17 00:16:25 +0000 UTC" firstStartedPulling="2026-02-17 00:16:26.917335745 +0000 UTC m=+644.396848272" lastFinishedPulling="2026-02-17 00:16:39.085515036 +0000 UTC m=+656.565027563" observedRunningTime="2026-02-17 00:16:40.143007311 +0000 UTC m=+657.622519828" watchObservedRunningTime="2026-02-17 00:16:41.041310774 +0000 UTC m=+658.520823321" Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.637961 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.662379 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2m9m\" (UniqueName: \"kubernetes.io/projected/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-kube-api-access-l2m9m\") pod \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.662442 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-bundle\") pod \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.662474 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-util\") pod \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\" (UID: \"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9\") " Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.664284 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-bundle" (OuterVolumeSpecName: "bundle") pod "ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" (UID: "ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.674557 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-util" (OuterVolumeSpecName: "util") pod "ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" (UID: "ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.702399 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-kube-api-access-l2m9m" (OuterVolumeSpecName: "kube-api-access-l2m9m") pod "ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" (UID: "ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9"). InnerVolumeSpecName "kube-api-access-l2m9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.763449 4791 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.763483 4791 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-util\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:42 crc kubenswrapper[4791]: I0217 00:16:42.763496 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2m9m\" (UniqueName: \"kubernetes.io/projected/ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9-kube-api-access-l2m9m\") on node \"crc\" DevicePath \"\"" Feb 17 00:16:43 crc kubenswrapper[4791]: I0217 00:16:43.048713 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" event={"ID":"ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9","Type":"ContainerDied","Data":"d63afa2231bbe954c8b70ba459a87aaba8077722339c934771fc84e62117d1c8"} Feb 17 00:16:43 crc kubenswrapper[4791]: I0217 00:16:43.049013 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d63afa2231bbe954c8b70ba459a87aaba8077722339c934771fc84e62117d1c8" Feb 17 00:16:43 crc kubenswrapper[4791]: I0217 00:16:43.048764 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm" Feb 17 00:16:43 crc kubenswrapper[4791]: I0217 00:16:43.051008 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" event={"ID":"9d24e1c4-bdcf-4ffa-8138-b1fb47410471","Type":"ContainerStarted","Data":"cb642cde62bb6e3f4561182cda6bac3b7a21c796661f905fb3d1be66aa26d173"} Feb 17 00:16:43 crc kubenswrapper[4791]: I0217 00:16:43.250231 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-d5d58ff4c-lwcwp" podStartSLOduration=7.179944458 podStartE2EDuration="10.250209924s" podCreationTimestamp="2026-02-17 00:16:33 +0000 UTC" firstStartedPulling="2026-02-17 00:16:39.616396446 +0000 UTC m=+657.095908973" lastFinishedPulling="2026-02-17 00:16:42.686661912 +0000 UTC m=+660.166174439" observedRunningTime="2026-02-17 00:16:43.076866137 +0000 UTC m=+660.556378664" watchObservedRunningTime="2026-02-17 00:16:43.250209924 +0000 UTC m=+660.729722461" Feb 17 00:16:46 crc kubenswrapper[4791]: I0217 00:16:46.752833 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-mk4lp" Feb 17 00:16:49 crc kubenswrapper[4791]: I0217 00:16:49.094334 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" event={"ID":"d51ceaf8-c8f2-4dc0-bbca-35d3562dea95","Type":"ContainerStarted","Data":"7e92305fd7cfeea49afe034fa0bc1eeb102bcf954c1a494f42aa6d1a4406956d"} Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.379065 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-8j89k" podStartSLOduration=13.884736432 podStartE2EDuration="22.379049082s" podCreationTimestamp="2026-02-17 00:16:36 +0000 UTC" firstStartedPulling="2026-02-17 00:16:39.50283168 +0000 UTC 
m=+656.982344207" lastFinishedPulling="2026-02-17 00:16:47.99714433 +0000 UTC m=+665.476656857" observedRunningTime="2026-02-17 00:16:49.113220379 +0000 UTC m=+666.592732906" watchObservedRunningTime="2026-02-17 00:16:58.379049082 +0000 UTC m=+675.858561609" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.382278 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 17 00:16:58 crc kubenswrapper[4791]: E0217 00:16:58.383151 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerName="pull" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.383170 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerName="pull" Feb 17 00:16:58 crc kubenswrapper[4791]: E0217 00:16:58.383179 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerName="util" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.383186 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerName="util" Feb 17 00:16:58 crc kubenswrapper[4791]: E0217 00:16:58.383202 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerName="extract" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.383209 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerName="extract" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.383490 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9" containerName="extract" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.384216 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.385374 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.387869 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.388438 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.388500 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-9vtp6" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.388750 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.388936 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.389062 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.389184 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.401957 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.408383 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471234 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471277 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471297 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471319 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471427 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: 
\"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471576 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471659 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471726 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471748 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: 
\"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471769 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471793 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471879 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.471945 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/335ade17-e7c1-487c-9e12-ad3d0d3610b0-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.472020 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.472049 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573310 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573362 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573390 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc 
kubenswrapper[4791]: I0217 00:16:58.573604 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573623 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573649 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573675 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573700 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: 
\"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573717 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573734 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573756 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573774 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc 
kubenswrapper[4791]: I0217 00:16:58.573795 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/335ade17-e7c1-487c-9e12-ad3d0d3610b0-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573824 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.573842 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.574195 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.574195 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc 
kubenswrapper[4791]: I0217 00:16:58.574309 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.574451 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.574781 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.575169 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.575186 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-config-local\") pod 
\"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.575922 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.583337 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.584179 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/335ade17-e7c1-487c-9e12-ad3d0d3610b0-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.584819 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.585021 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: 
\"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.587851 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.591779 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.599741 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/335ade17-e7c1-487c-9e12-ad3d0d3610b0-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"335ade17-e7c1-487c-9e12-ad3d0d3610b0\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:58 crc kubenswrapper[4791]: I0217 00:16:58.701520 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:16:59 crc kubenswrapper[4791]: I0217 00:16:59.015077 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 17 00:16:59 crc kubenswrapper[4791]: I0217 00:16:59.153832 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"335ade17-e7c1-487c-9e12-ad3d0d3610b0","Type":"ContainerStarted","Data":"88d6b483b0014d6e616311beef6b916ce80ae994c03f1b13e031beed251561a5"} Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.120057 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm"] Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.121314 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm" Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.124474 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.124534 4791 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-lcc8z" Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.124683 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.132786 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm"] Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.177914 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/4bc89f8e-fb5f-4ba2-826a-d93c8a11383c-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-wbhjm\" (UID: \"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm" Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.178206 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9qts\" (UniqueName: \"kubernetes.io/projected/4bc89f8e-fb5f-4ba2-826a-d93c8a11383c-kube-api-access-s9qts\") pod \"cert-manager-operator-controller-manager-5586865c96-wbhjm\" (UID: \"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm" Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.279777 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4bc89f8e-fb5f-4ba2-826a-d93c8a11383c-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-wbhjm\" (UID: \"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm" Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.279835 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9qts\" (UniqueName: \"kubernetes.io/projected/4bc89f8e-fb5f-4ba2-826a-d93c8a11383c-kube-api-access-s9qts\") pod \"cert-manager-operator-controller-manager-5586865c96-wbhjm\" (UID: \"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm" Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.280951 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4bc89f8e-fb5f-4ba2-826a-d93c8a11383c-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-wbhjm\" (UID: 
\"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm" Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.302491 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9qts\" (UniqueName: \"kubernetes.io/projected/4bc89f8e-fb5f-4ba2-826a-d93c8a11383c-kube-api-access-s9qts\") pod \"cert-manager-operator-controller-manager-5586865c96-wbhjm\" (UID: \"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm" Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.442620 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm" Feb 17 00:17:04 crc kubenswrapper[4791]: I0217 00:17:04.694845 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm"] Feb 17 00:17:04 crc kubenswrapper[4791]: W0217 00:17:04.707541 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bc89f8e_fb5f_4ba2_826a_d93c8a11383c.slice/crio-3103e33386c43114e32dea6a09ec6c279543d07e1076b472e821c396a127b54e WatchSource:0}: Error finding container 3103e33386c43114e32dea6a09ec6c279543d07e1076b472e821c396a127b54e: Status 404 returned error can't find the container with id 3103e33386c43114e32dea6a09ec6c279543d07e1076b472e821c396a127b54e Feb 17 00:17:05 crc kubenswrapper[4791]: I0217 00:17:05.191963 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm" event={"ID":"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c","Type":"ContainerStarted","Data":"3103e33386c43114e32dea6a09ec6c279543d07e1076b472e821c396a127b54e"} Feb 17 00:17:15 crc kubenswrapper[4791]: E0217 00:17:15.895979 4791 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="registry.connect.redhat.com/elastic/elasticsearch:7.17.20" Feb 17 00:17:15 crc kubenswrapper[4791]: E0217 00:17:15.896750 4791 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:elastic-internal-init-filesystem,Image:registry.connect.redhat.com/elastic/elasticsearch:7.17.20,Command:[bash -c /mnt/elastic-internal/scripts/prepare-fs.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:HEADLESS_SERVICE_NAME,Value:elasticsearch-es-default,ValueFrom:nil,},EnvVar{Name:PROBE_PASSWORD_PATH,Value:/mnt/elastic-internal/pod-mounted-users/elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:PROBE_USERNAME,Value:elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:READINESS_PROBE_PROTOCOL,Value:https,ValueFrom:nil,},EnvVar{Name:NSS_SDB_USE_CACHE,Value:no,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:downward-api,ReadOnly:true,MountPath:/mnt/elastic-internal/downward-api,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-bin-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-bin-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config,ReadOnly:true,MountPath:/mnt/elastic-internal/elasticsearch-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-config-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-plugins-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-plugins-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-http-certificates,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/http-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-probe-user,ReadOnly:true,MountPath:/mnt/elastic-internal/pod-mounted-users,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-remote-certificate-authorities,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/transport-remote-certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-scripts,ReadOnly:true,MountPath:/mnt/elastic-internal/scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-transport-certificates,ReadOnly:true,MountPath:/mnt/elastic-internal/transport-certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-unicast-hosts,ReadOnly:
true,MountPath:/mnt/elastic-internal/unicast-hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-xpack-file-realm,ReadOnly:true,MountPath:/mnt/elastic-internal/xpack-file-realm,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-data,ReadOnly:false,MountPath:/usr/share/elasticsearch/data,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-logs,ReadOnly:false,MountPath:/usr/share/elasticsearch/logs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-volume,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod elasticsearch-es-default-0_service-telemetry(335ade17-e7c1-487c-9e12-ad3d0d3610b0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 00:17:15 crc kubenswrapper[4791]: E0217 00:17:15.898024 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="335ade17-e7c1-487c-9e12-ad3d0d3610b0" Feb 17 00:17:16 crc 
kubenswrapper[4791]: I0217 00:17:16.268312 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm" event={"ID":"4bc89f8e-fb5f-4ba2-826a-d93c8a11383c","Type":"ContainerStarted","Data":"efecf6143e91b35e4b5e6220794aa692f62f75c8dd9fa04143d3e2abad00b483"} Feb 17 00:17:16 crc kubenswrapper[4791]: E0217 00:17:16.269604 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="335ade17-e7c1-487c-9e12-ad3d0d3610b0" Feb 17 00:17:16 crc kubenswrapper[4791]: I0217 00:17:16.328870 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-wbhjm" podStartSLOduration=1.225165164 podStartE2EDuration="12.328849377s" podCreationTimestamp="2026-02-17 00:17:04 +0000 UTC" firstStartedPulling="2026-02-17 00:17:04.711346958 +0000 UTC m=+682.190859515" lastFinishedPulling="2026-02-17 00:17:15.815031181 +0000 UTC m=+693.294543728" observedRunningTime="2026-02-17 00:17:16.32605878 +0000 UTC m=+693.805571307" watchObservedRunningTime="2026-02-17 00:17:16.328849377 +0000 UTC m=+693.808361904" Feb 17 00:17:16 crc kubenswrapper[4791]: I0217 00:17:16.407728 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 17 00:17:16 crc kubenswrapper[4791]: I0217 00:17:16.442057 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 17 00:17:17 crc kubenswrapper[4791]: E0217 00:17:17.274812 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="335ade17-e7c1-487c-9e12-ad3d0d3610b0" Feb 17 00:17:18 crc kubenswrapper[4791]: E0217 00:17:18.278633 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="335ade17-e7c1-487c-9e12-ad3d0d3610b0" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.686101 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-42zhs"] Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.687553 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.689943 4791 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-ppxz9" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.690403 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.690546 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.695000 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-42zhs"] Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.780532 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4hdw\" (UniqueName: \"kubernetes.io/projected/4b26c415-6a42-4bda-abbd-cf394bc94043-kube-api-access-p4hdw\") pod \"cert-manager-webhook-6888856db4-42zhs\" (UID: 
\"4b26c415-6a42-4bda-abbd-cf394bc94043\") " pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.780603 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b26c415-6a42-4bda-abbd-cf394bc94043-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-42zhs\" (UID: \"4b26c415-6a42-4bda-abbd-cf394bc94043\") " pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.882051 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b26c415-6a42-4bda-abbd-cf394bc94043-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-42zhs\" (UID: \"4b26c415-6a42-4bda-abbd-cf394bc94043\") " pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.882214 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4hdw\" (UniqueName: \"kubernetes.io/projected/4b26c415-6a42-4bda-abbd-cf394bc94043-kube-api-access-p4hdw\") pod \"cert-manager-webhook-6888856db4-42zhs\" (UID: \"4b26c415-6a42-4bda-abbd-cf394bc94043\") " pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.911377 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4hdw\" (UniqueName: \"kubernetes.io/projected/4b26c415-6a42-4bda-abbd-cf394bc94043-kube-api-access-p4hdw\") pod \"cert-manager-webhook-6888856db4-42zhs\" (UID: \"4b26c415-6a42-4bda-abbd-cf394bc94043\") " pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:18 crc kubenswrapper[4791]: I0217 00:17:18.923231 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/4b26c415-6a42-4bda-abbd-cf394bc94043-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-42zhs\" (UID: \"4b26c415-6a42-4bda-abbd-cf394bc94043\") " pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:19 crc kubenswrapper[4791]: I0217 00:17:19.005153 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:19 crc kubenswrapper[4791]: I0217 00:17:19.252623 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-42zhs"] Feb 17 00:17:19 crc kubenswrapper[4791]: I0217 00:17:19.284540 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" event={"ID":"4b26c415-6a42-4bda-abbd-cf394bc94043","Type":"ContainerStarted","Data":"cb83f072908cc41e0fd023cebdc45b8fb2fa757a086a0de47168b6d4c4b95a54"} Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.235302 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-l9798"] Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.236878 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.239287 4791 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-88mxr" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.264345 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-l9798"] Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.333411 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll4z9\" (UniqueName: \"kubernetes.io/projected/aca3c38d-0dd8-4457-854a-b392ba180087-kube-api-access-ll4z9\") pod \"cert-manager-cainjector-5545bd876-l9798\" (UID: \"aca3c38d-0dd8-4457-854a-b392ba180087\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.333697 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aca3c38d-0dd8-4457-854a-b392ba180087-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-l9798\" (UID: \"aca3c38d-0dd8-4457-854a-b392ba180087\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.434892 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll4z9\" (UniqueName: \"kubernetes.io/projected/aca3c38d-0dd8-4457-854a-b392ba180087-kube-api-access-ll4z9\") pod \"cert-manager-cainjector-5545bd876-l9798\" (UID: \"aca3c38d-0dd8-4457-854a-b392ba180087\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.434960 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/aca3c38d-0dd8-4457-854a-b392ba180087-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-l9798\" (UID: \"aca3c38d-0dd8-4457-854a-b392ba180087\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.455654 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll4z9\" (UniqueName: \"kubernetes.io/projected/aca3c38d-0dd8-4457-854a-b392ba180087-kube-api-access-ll4z9\") pod \"cert-manager-cainjector-5545bd876-l9798\" (UID: \"aca3c38d-0dd8-4457-854a-b392ba180087\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.455908 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aca3c38d-0dd8-4457-854a-b392ba180087-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-l9798\" (UID: \"aca3c38d-0dd8-4457-854a-b392ba180087\") " pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:22 crc kubenswrapper[4791]: I0217 00:17:22.566578 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" Feb 17 00:17:24 crc kubenswrapper[4791]: I0217 00:17:24.310907 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-l9798"] Feb 17 00:17:24 crc kubenswrapper[4791]: W0217 00:17:24.321779 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaca3c38d_0dd8_4457_854a_b392ba180087.slice/crio-88f2cf39f5fa160c62d89366d3500d68a88310744e285ff20e90266503f04362 WatchSource:0}: Error finding container 88f2cf39f5fa160c62d89366d3500d68a88310744e285ff20e90266503f04362: Status 404 returned error can't find the container with id 88f2cf39f5fa160c62d89366d3500d68a88310744e285ff20e90266503f04362 Feb 17 00:17:24 crc kubenswrapper[4791]: I0217 00:17:24.334655 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" event={"ID":"aca3c38d-0dd8-4457-854a-b392ba180087","Type":"ContainerStarted","Data":"88f2cf39f5fa160c62d89366d3500d68a88310744e285ff20e90266503f04362"} Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.343995 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" event={"ID":"aca3c38d-0dd8-4457-854a-b392ba180087","Type":"ContainerStarted","Data":"32a101c5108440b5174b1a25337b522ac0e2f252243397f619bc2c8bf52eca97"} Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.345834 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" event={"ID":"4b26c415-6a42-4bda-abbd-cf394bc94043","Type":"ContainerStarted","Data":"2599d909254c9145c013a9f8b1cccad4b26f21fb9eda93dcc51e01d72be7a590"} Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.345985 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:25 crc 
kubenswrapper[4791]: I0217 00:17:25.467968 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-l9798" podStartSLOduration=3.467950891 podStartE2EDuration="3.467950891s" podCreationTimestamp="2026-02-17 00:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:17:25.39503057 +0000 UTC m=+702.874543097" watchObservedRunningTime="2026-02-17 00:17:25.467950891 +0000 UTC m=+702.947463428" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.469828 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" podStartSLOduration=2.572843878 podStartE2EDuration="7.469818661s" podCreationTimestamp="2026-02-17 00:17:18 +0000 UTC" firstStartedPulling="2026-02-17 00:17:19.259602326 +0000 UTC m=+696.739114853" lastFinishedPulling="2026-02-17 00:17:24.156577109 +0000 UTC m=+701.636089636" observedRunningTime="2026-02-17 00:17:25.465489004 +0000 UTC m=+702.945001541" watchObservedRunningTime="2026-02-17 00:17:25.469818661 +0000 UTC m=+702.949331208" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.509662 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.510772 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.514789 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.514933 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.515278 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.515805 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.548976 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591128 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591362 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591438 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591505 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591569 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591692 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591767 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd6ln\" (UniqueName: \"kubernetes.io/projected/d8695c88-2448-4593-8029-3ce49d07ca00-kube-api-access-kd6ln\") pod 
\"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591845 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591915 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.591989 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.592085 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.592199 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.693524 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.693754 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.694301 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.694967 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.695342 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.695678 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.695797 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd6ln\" (UniqueName: \"kubernetes.io/projected/d8695c88-2448-4593-8029-3ce49d07ca00-kube-api-access-kd6ln\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.696122 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.696398 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-system-configs\") pod 
\"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.696792 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.696927 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.697325 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.694922 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.696364 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.695637 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.696762 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.694253 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.696888 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.695769 4791 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.695304 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.698177 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.700759 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.714684 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd6ln\" (UniqueName: \"kubernetes.io/projected/d8695c88-2448-4593-8029-3ce49d07ca00-kube-api-access-kd6ln\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.716055 
4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:25 crc kubenswrapper[4791]: I0217 00:17:25.823898 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Feb 17 00:17:26 crc kubenswrapper[4791]: I0217 00:17:26.088455 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 17 00:17:26 crc kubenswrapper[4791]: I0217 00:17:26.353794 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"d8695c88-2448-4593-8029-3ce49d07ca00","Type":"ContainerStarted","Data":"f8c7e3abade9be907d19dcea3944ab13d63c479cb28ae61ecc17366d326050c2"} Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.009721 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-42zhs" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.737008 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-9dsmn"] Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.737913 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.740716 4791 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-g6pq2" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.749085 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9dsmn"] Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.850161 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhp7c\" (UniqueName: \"kubernetes.io/projected/bf759390-4034-42c9-811b-531aeabd3ed6-kube-api-access-bhp7c\") pod \"cert-manager-545d4d4674-9dsmn\" (UID: \"bf759390-4034-42c9-811b-531aeabd3ed6\") " pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.850222 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf759390-4034-42c9-811b-531aeabd3ed6-bound-sa-token\") pod \"cert-manager-545d4d4674-9dsmn\" (UID: \"bf759390-4034-42c9-811b-531aeabd3ed6\") " pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.951856 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhp7c\" (UniqueName: \"kubernetes.io/projected/bf759390-4034-42c9-811b-531aeabd3ed6-kube-api-access-bhp7c\") pod \"cert-manager-545d4d4674-9dsmn\" (UID: \"bf759390-4034-42c9-811b-531aeabd3ed6\") " pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.951911 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf759390-4034-42c9-811b-531aeabd3ed6-bound-sa-token\") pod \"cert-manager-545d4d4674-9dsmn\" (UID: 
\"bf759390-4034-42c9-811b-531aeabd3ed6\") " pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.979622 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf759390-4034-42c9-811b-531aeabd3ed6-bound-sa-token\") pod \"cert-manager-545d4d4674-9dsmn\" (UID: \"bf759390-4034-42c9-811b-531aeabd3ed6\") " pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:29 crc kubenswrapper[4791]: I0217 00:17:29.979878 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhp7c\" (UniqueName: \"kubernetes.io/projected/bf759390-4034-42c9-811b-531aeabd3ed6-kube-api-access-bhp7c\") pod \"cert-manager-545d4d4674-9dsmn\" (UID: \"bf759390-4034-42c9-811b-531aeabd3ed6\") " pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:30 crc kubenswrapper[4791]: I0217 00:17:30.066614 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9dsmn" Feb 17 00:17:33 crc kubenswrapper[4791]: I0217 00:17:33.152179 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9dsmn"] Feb 17 00:17:33 crc kubenswrapper[4791]: I0217 00:17:33.396776 4791 generic.go:334] "Generic (PLEG): container finished" podID="d8695c88-2448-4593-8029-3ce49d07ca00" containerID="b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8" exitCode=0 Feb 17 00:17:33 crc kubenswrapper[4791]: I0217 00:17:33.396856 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"d8695c88-2448-4593-8029-3ce49d07ca00","Type":"ContainerDied","Data":"b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8"} Feb 17 00:17:33 crc kubenswrapper[4791]: I0217 00:17:33.398393 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9dsmn" 
event={"ID":"bf759390-4034-42c9-811b-531aeabd3ed6","Type":"ContainerStarted","Data":"eeceee21040b1bf28580fb56818482346ebf369b17eeb2321be2018b1d2a00c8"} Feb 17 00:17:33 crc kubenswrapper[4791]: I0217 00:17:33.398439 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9dsmn" event={"ID":"bf759390-4034-42c9-811b-531aeabd3ed6","Type":"ContainerStarted","Data":"1f8d652602d661e7a55fb3fe99a84922d0cad37c0465c53d81b792f11279f06d"} Feb 17 00:17:33 crc kubenswrapper[4791]: I0217 00:17:33.446400 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-9dsmn" podStartSLOduration=4.446379352 podStartE2EDuration="4.446379352s" podCreationTimestamp="2026-02-17 00:17:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:17:33.443902783 +0000 UTC m=+710.923415320" watchObservedRunningTime="2026-02-17 00:17:33.446379352 +0000 UTC m=+710.925891889" Feb 17 00:17:34 crc kubenswrapper[4791]: I0217 00:17:34.407460 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"d8695c88-2448-4593-8029-3ce49d07ca00","Type":"ContainerStarted","Data":"9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74"} Feb 17 00:17:34 crc kubenswrapper[4791]: I0217 00:17:34.410210 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"335ade17-e7c1-487c-9e12-ad3d0d3610b0","Type":"ContainerStarted","Data":"c4d4033b577bd6d027c697301176c69c0a46b27d66a578e3f38c80be97b97a25"} Feb 17 00:17:34 crc kubenswrapper[4791]: I0217 00:17:34.443013 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=2.754913236 podStartE2EDuration="9.442988671s" podCreationTimestamp="2026-02-17 00:17:25 
+0000 UTC" firstStartedPulling="2026-02-17 00:17:26.094270829 +0000 UTC m=+703.573783376" lastFinishedPulling="2026-02-17 00:17:32.782346244 +0000 UTC m=+710.261858811" observedRunningTime="2026-02-17 00:17:34.439925615 +0000 UTC m=+711.919438152" watchObservedRunningTime="2026-02-17 00:17:34.442988671 +0000 UTC m=+711.922501238" Feb 17 00:17:35 crc kubenswrapper[4791]: I0217 00:17:35.557846 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 17 00:17:36 crc kubenswrapper[4791]: I0217 00:17:36.436084 4791 generic.go:334] "Generic (PLEG): container finished" podID="335ade17-e7c1-487c-9e12-ad3d0d3610b0" containerID="c4d4033b577bd6d027c697301176c69c0a46b27d66a578e3f38c80be97b97a25" exitCode=0 Feb 17 00:17:36 crc kubenswrapper[4791]: I0217 00:17:36.436196 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"335ade17-e7c1-487c-9e12-ad3d0d3610b0","Type":"ContainerDied","Data":"c4d4033b577bd6d027c697301176c69c0a46b27d66a578e3f38c80be97b97a25"} Feb 17 00:17:36 crc kubenswrapper[4791]: I0217 00:17:36.436904 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="d8695c88-2448-4593-8029-3ce49d07ca00" containerName="docker-build" containerID="cri-o://9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74" gracePeriod=30 Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.259848 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.261727 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.264878 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.265574 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.265869 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.288995 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.453800 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.453881 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.453931 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildworkdir\") pod 
\"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454024 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454240 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454333 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454371 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454412 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl8nj\" (UniqueName: \"kubernetes.io/projected/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-kube-api-access-gl8nj\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454530 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454590 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454783 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.454857 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-proxy-ca-bundles\") pod 
\"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556173 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556260 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556327 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556362 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556433 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556367 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556472 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556532 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556579 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556616 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556655 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl8nj\" (UniqueName: \"kubernetes.io/projected/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-kube-api-access-gl8nj\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556693 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556725 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556865 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.556888 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.557246 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.557426 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.557605 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.557796 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.558239 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.558566 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.562308 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.562449 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.573569 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl8nj\" (UniqueName: \"kubernetes.io/projected/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-kube-api-access-gl8nj\") pod \"service-telemetry-operator-2-build\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") " pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:37 crc kubenswrapper[4791]: I0217 00:17:37.580409 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:40 crc kubenswrapper[4791]: W0217 00:17:40.534054 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a3e8ccc_76aa_44db_b5bb_f4043e185f4f.slice/crio-000d39e60be5101bdcfddeeab74e9e6f53543b1e253e3e0112987c261000a9ce WatchSource:0}: Error finding container 000d39e60be5101bdcfddeeab74e9e6f53543b1e253e3e0112987c261000a9ce: Status 404 returned error can't find the container with id 000d39e60be5101bdcfddeeab74e9e6f53543b1e253e3e0112987c261000a9ce
Feb 17 00:17:40 crc kubenswrapper[4791]: I0217 00:17:40.539083 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"]
Feb 17 00:17:41 crc kubenswrapper[4791]: I0217 00:17:41.478566 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f","Type":"ContainerStarted","Data":"000d39e60be5101bdcfddeeab74e9e6f53543b1e253e3e0112987c261000a9ce"}
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.056442 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_d8695c88-2448-4593-8029-3ce49d07ca00/docker-build/0.log"
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.058012 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build"
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.243823 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd6ln\" (UniqueName: \"kubernetes.io/projected/d8695c88-2448-4593-8029-3ce49d07ca00-kube-api-access-kd6ln\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") "
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.244562 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-buildworkdir\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") "
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.244662 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-system-configs\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") "
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.244712 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-ca-bundles\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") "
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.244833 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-build-blob-cache\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") "
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.244899 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-push\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") "
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.244967 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-run\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") "
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245037 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-pull\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") "
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245154 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-root\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") "
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245215 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-node-pullsecrets\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") "
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245254 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-buildcachedir\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") "
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245306 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-proxy-ca-bundles\") pod \"d8695c88-2448-4593-8029-3ce49d07ca00\" (UID: \"d8695c88-2448-4593-8029-3ce49d07ca00\") "
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245398 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245484 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245580 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245521 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.245769 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.246229 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-buildcachedir\") on node \"crc\" DevicePath \"\""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.246270 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8695c88-2448-4593-8029-3ce49d07ca00-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.246298 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-buildworkdir\") on node \"crc\" DevicePath \"\""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.246322 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-system-configs\") on node \"crc\" DevicePath \"\""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.246350 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-build-blob-cache\") on node \"crc\" DevicePath \"\""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.246740 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.247100 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.247595 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.248096 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.251760 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.252495 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.252839 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8695c88-2448-4593-8029-3ce49d07ca00-kube-api-access-kd6ln" (OuterVolumeSpecName: "kube-api-access-kd6ln") pod "d8695c88-2448-4593-8029-3ce49d07ca00" (UID: "d8695c88-2448-4593-8029-3ce49d07ca00"). InnerVolumeSpecName "kube-api-access-kd6ln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.348024 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.348094 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-run\") on node \"crc\" DevicePath \"\""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.348187 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/d8695c88-2448-4593-8029-3ce49d07ca00-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.348211 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8695c88-2448-4593-8029-3ce49d07ca00-container-storage-root\") on node \"crc\" DevicePath \"\""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.348229 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.348246 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd6ln\" (UniqueName: \"kubernetes.io/projected/d8695c88-2448-4593-8029-3ce49d07ca00-kube-api-access-kd6ln\") on node \"crc\" DevicePath \"\""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.348265 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8695c88-2448-4593-8029-3ce49d07ca00-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.504803 4791 generic.go:334] "Generic (PLEG): container finished" podID="335ade17-e7c1-487c-9e12-ad3d0d3610b0" containerID="9e94e4637793dfe889be430349ee205bb9c145a8da231847d4d11b6bbde20812" exitCode=0
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.504857 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"335ade17-e7c1-487c-9e12-ad3d0d3610b0","Type":"ContainerDied","Data":"9e94e4637793dfe889be430349ee205bb9c145a8da231847d4d11b6bbde20812"}
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.508484 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_d8695c88-2448-4593-8029-3ce49d07ca00/docker-build/0.log"
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.508986 4791 generic.go:334] "Generic (PLEG): container finished" podID="d8695c88-2448-4593-8029-3ce49d07ca00" containerID="9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74" exitCode=1
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.509069 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"d8695c88-2448-4593-8029-3ce49d07ca00","Type":"ContainerDied","Data":"9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74"}
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.509104 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"d8695c88-2448-4593-8029-3ce49d07ca00","Type":"ContainerDied","Data":"f8c7e3abade9be907d19dcea3944ab13d63c479cb28ae61ecc17366d326050c2"}
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.509177 4791 scope.go:117] "RemoveContainer" containerID="9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74"
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.509350 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build"
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.513098 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f","Type":"ContainerStarted","Data":"61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7"}
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.588487 4791 scope.go:117] "RemoveContainer" containerID="b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8"
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.588835 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"]
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.596596 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"]
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.624208 4791 scope.go:117] "RemoveContainer" containerID="9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74"
Feb 17 00:17:44 crc kubenswrapper[4791]: E0217 00:17:44.624693 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74\": container with ID starting with 9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74 not found: ID does not exist" containerID="9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74"
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.624727 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74"} err="failed to get container status \"9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74\": rpc error: code = NotFound desc = could not find container \"9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74\": container with ID starting with 9ae2c8e111457326ead8306b8de246dbef63976b944026f31b972d279ea4fd74 not found: ID does not exist"
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.624746 4791 scope.go:117] "RemoveContainer" containerID="b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8"
Feb 17 00:17:44 crc kubenswrapper[4791]: E0217 00:17:44.625168 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8\": container with ID starting with b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8 not found: ID does not exist" containerID="b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8"
Feb 17 00:17:44 crc kubenswrapper[4791]: I0217 00:17:44.625190 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8"} err="failed to get container status \"b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8\": rpc error: code = NotFound desc = could not find container \"b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8\": container with ID starting with b93839fd038cc9cd6db52810ac7b3d237a1960eebdd63f2c5391188ab51bd3e8 not found: ID does not exist"
Feb 17 00:17:44 crc kubenswrapper[4791]: E0217 00:17:44.669727 4791 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5079511847538378697, SKID=, AKID=AD:CC:45:B9:DF:E8:B7:57:BC:FE:80:6F:F9:92:EA:D4:BA:33:46:63 failed: x509: certificate signed by unknown authority"
Feb 17 00:17:45 crc kubenswrapper[4791]: I0217 00:17:45.233945 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8695c88-2448-4593-8029-3ce49d07ca00" path="/var/lib/kubelet/pods/d8695c88-2448-4593-8029-3ce49d07ca00/volumes"
Feb 17 00:17:45 crc kubenswrapper[4791]: I0217 00:17:45.523624 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"335ade17-e7c1-487c-9e12-ad3d0d3610b0","Type":"ContainerStarted","Data":"f99d0ba12b6f516ce592793e9ba34f5ae8f85a650bd90b8412798ee4cdcb4c0e"}
Feb 17 00:17:45 crc kubenswrapper[4791]: I0217 00:17:45.524196 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0"
Feb 17 00:17:45 crc kubenswrapper[4791]: I0217 00:17:45.574420 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=13.329613495 podStartE2EDuration="47.574403286s" podCreationTimestamp="2026-02-17 00:16:58 +0000 UTC" firstStartedPulling="2026-02-17 00:16:59.020677557 +0000 UTC m=+676.500190084" lastFinishedPulling="2026-02-17 00:17:33.265467348 +0000 UTC m=+710.744979875" observedRunningTime="2026-02-17 00:17:45.573735645 +0000 UTC m=+723.053248172" watchObservedRunningTime="2026-02-17 00:17:45.574403286 +0000 UTC m=+723.053915813"
Feb 17 00:17:45 crc kubenswrapper[4791]: I0217 00:17:45.705263 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"]
Feb 17 00:17:46 crc kubenswrapper[4791]: I0217 00:17:46.529959 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-2-build" podUID="6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" containerName="git-clone" containerID="cri-o://61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7" gracePeriod=30
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.007663 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_6a3e8ccc-76aa-44db-b5bb-f4043e185f4f/git-clone/0.log"
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.008003 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build"
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.099267 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-ca-bundles\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") "
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.099560 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl8nj\" (UniqueName: \"kubernetes.io/projected/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-kube-api-access-gl8nj\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") "
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.099644 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildcachedir\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") "
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.099714 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-pull\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") "
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.099820 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-root\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") "
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.099918 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-push\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") "
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100018 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-run\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") "
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100182 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-blob-cache\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") "
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100261 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-proxy-ca-bundles\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") "
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.099746 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100423 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildworkdir\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") "
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100500 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-system-configs\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") "
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100565 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-node-pullsecrets\") pod \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\" (UID: \"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f\") "
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100645 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100883 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.100943 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildcachedir\") on node \"crc\" DevicePath \"\""
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.107603 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.107630 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.109758 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.109847 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.111627 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.111726 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.111811 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "buildworkdir".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.112251 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.113779 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-kube-api-access-gl8nj" (OuterVolumeSpecName: "kube-api-access-gl8nj") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "kube-api-access-gl8nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.117121 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" (UID: "6a3e8ccc-76aa-44db-b5bb-f4043e185f4f"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.201512 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.201755 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.201837 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.201915 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.202019 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.202095 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.202190 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-system-configs\") on node \"crc\" 
DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.202263 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.202343 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl8nj\" (UniqueName: \"kubernetes.io/projected/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-kube-api-access-gl8nj\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.202407 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.536085 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_6a3e8ccc-76aa-44db-b5bb-f4043e185f4f/git-clone/0.log" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.536152 4791 generic.go:334] "Generic (PLEG): container finished" podID="6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" containerID="61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7" exitCode=1 Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.536185 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f","Type":"ContainerDied","Data":"61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7"} Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.536218 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" 
event={"ID":"6a3e8ccc-76aa-44db-b5bb-f4043e185f4f","Type":"ContainerDied","Data":"000d39e60be5101bdcfddeeab74e9e6f53543b1e253e3e0112987c261000a9ce"} Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.536228 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.536237 4791 scope.go:117] "RemoveContainer" containerID="61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.558680 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.559059 4791 scope.go:117] "RemoveContainer" containerID="61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7" Feb 17 00:17:47 crc kubenswrapper[4791]: E0217 00:17:47.559747 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7\": container with ID starting with 61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7 not found: ID does not exist" containerID="61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 00:17:47.559776 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7"} err="failed to get container status \"61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7\": rpc error: code = NotFound desc = could not find container \"61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7\": container with ID starting with 61805709983d6546ba064b2eea228c45d5e8020a686cfb542d2d233b9b75f0a7 not found: ID does not exist" Feb 17 00:17:47 crc kubenswrapper[4791]: I0217 
00:17:47.564753 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Feb 17 00:17:49 crc kubenswrapper[4791]: I0217 00:17:49.228729 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" path="/var/lib/kubelet/pods/6a3e8ccc-76aa-44db-b5bb-f4043e185f4f/volumes" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.115220 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Feb 17 00:17:57 crc kubenswrapper[4791]: E0217 00:17:57.118050 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" containerName="git-clone" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.118308 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" containerName="git-clone" Feb 17 00:17:57 crc kubenswrapper[4791]: E0217 00:17:57.118476 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8695c88-2448-4593-8029-3ce49d07ca00" containerName="docker-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.118632 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8695c88-2448-4593-8029-3ce49d07ca00" containerName="docker-build" Feb 17 00:17:57 crc kubenswrapper[4791]: E0217 00:17:57.118799 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8695c88-2448-4593-8029-3ce49d07ca00" containerName="manage-dockerfile" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.118953 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8695c88-2448-4593-8029-3ce49d07ca00" containerName="manage-dockerfile" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.119385 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a3e8ccc-76aa-44db-b5bb-f4043e185f4f" containerName="git-clone" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.119584 4791 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d8695c88-2448-4593-8029-3ce49d07ca00" containerName="docker-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.121676 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.126434 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-sys-config" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.127005 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-global-ca" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.127473 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-ca" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.130259 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.140803 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.140847 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 
00:17:57.140889 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.140958 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.140998 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.141012 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.141040 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp9mn\" (UniqueName: 
\"kubernetes.io/projected/295671d5-4684-438c-8761-4a5d0eb6a9c5-kube-api-access-rp9mn\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.141072 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.141092 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.141133 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.141158 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 
00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.141172 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.148737 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.241917 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.242637 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.242890 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.243829 4791 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.243978 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp9mn\" (UniqueName: \"kubernetes.io/projected/295671d5-4684-438c-8761-4a5d0eb6a9c5-kube-api-access-rp9mn\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.244437 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.244625 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.244747 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: 
I0217 00:17:57.244846 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.244944 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245045 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245164 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245275 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " 
pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245405 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245626 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245470 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245923 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.244996 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-ca-bundles\") pod 
\"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.246267 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.245420 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.247276 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.260985 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.260985 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: 
\"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.265169 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp9mn\" (UniqueName: \"kubernetes.io/projected/295671d5-4684-438c-8761-4a5d0eb6a9c5-kube-api-access-rp9mn\") pod \"service-telemetry-operator-3-build\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.440998 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:17:57 crc kubenswrapper[4791]: I0217 00:17:57.885520 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Feb 17 00:17:57 crc kubenswrapper[4791]: W0217 00:17:57.893619 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod295671d5_4684_438c_8761_4a5d0eb6a9c5.slice/crio-315ebbc182138ebf30ba7132dabab3344781917088dda50717dd8353292508b8 WatchSource:0}: Error finding container 315ebbc182138ebf30ba7132dabab3344781917088dda50717dd8353292508b8: Status 404 returned error can't find the container with id 315ebbc182138ebf30ba7132dabab3344781917088dda50717dd8353292508b8 Feb 17 00:17:58 crc kubenswrapper[4791]: I0217 00:17:58.618511 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"295671d5-4684-438c-8761-4a5d0eb6a9c5","Type":"ContainerStarted","Data":"e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9"} Feb 17 00:17:58 crc kubenswrapper[4791]: I0217 00:17:58.618841 4791 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"295671d5-4684-438c-8761-4a5d0eb6a9c5","Type":"ContainerStarted","Data":"315ebbc182138ebf30ba7132dabab3344781917088dda50717dd8353292508b8"} Feb 17 00:17:58 crc kubenswrapper[4791]: E0217 00:17:58.679670 4791 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5079511847538378697, SKID=, AKID=AD:CC:45:B9:DF:E8:B7:57:BC:FE:80:6F:F9:92:EA:D4:BA:33:46:63 failed: x509: certificate signed by unknown authority" Feb 17 00:17:59 crc kubenswrapper[4791]: I0217 00:17:59.374858 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Feb 17 00:17:59 crc kubenswrapper[4791]: I0217 00:17:59.722300 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Feb 17 00:18:00 crc kubenswrapper[4791]: I0217 00:18:00.633606 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-3-build" podUID="295671d5-4684-438c-8761-4a5d0eb6a9c5" containerName="git-clone" containerID="cri-o://e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9" gracePeriod=30 Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.016055 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-3-build_295671d5-4684-438c-8761-4a5d0eb6a9c5/git-clone/0.log" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.016347 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.112682 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-push\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.112736 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildcachedir\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.112770 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-system-configs\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.112815 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-root\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.112862 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp9mn\" (UniqueName: \"kubernetes.io/projected/295671d5-4684-438c-8761-4a5d0eb6a9c5-kube-api-access-rp9mn\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113382 4791 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-run\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.112863 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113375 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113421 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-blob-cache\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113460 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-pull\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113450 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113510 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-proxy-ca-bundles\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113556 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-ca-bundles\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113594 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-node-pullsecrets\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113631 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildworkdir\") pod \"295671d5-4684-438c-8761-4a5d0eb6a9c5\" (UID: \"295671d5-4684-438c-8761-4a5d0eb6a9c5\") " Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113747 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113770 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.113836 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114130 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114150 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114158 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114167 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114175 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114184 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114214 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114256 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.114434 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.120297 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.120314 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295671d5-4684-438c-8761-4a5d0eb6a9c5-kube-api-access-rp9mn" (OuterVolumeSpecName: "kube-api-access-rp9mn") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "kube-api-access-rp9mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.120332 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "295671d5-4684-438c-8761-4a5d0eb6a9c5" (UID: "295671d5-4684-438c-8761-4a5d0eb6a9c5"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.215422 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/295671d5-4684-438c-8761-4a5d0eb6a9c5-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.215459 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.215473 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp9mn\" (UniqueName: \"kubernetes.io/projected/295671d5-4684-438c-8761-4a5d0eb6a9c5-kube-api-access-rp9mn\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.215484 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/295671d5-4684-438c-8761-4a5d0eb6a9c5-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.215496 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.215507 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/295671d5-4684-438c-8761-4a5d0eb6a9c5-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.643258 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-3-build_295671d5-4684-438c-8761-4a5d0eb6a9c5/git-clone/0.log" Feb 17 00:18:01 crc 
kubenswrapper[4791]: I0217 00:18:01.643364 4791 generic.go:334] "Generic (PLEG): container finished" podID="295671d5-4684-438c-8761-4a5d0eb6a9c5" containerID="e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9" exitCode=1 Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.643405 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"295671d5-4684-438c-8761-4a5d0eb6a9c5","Type":"ContainerDied","Data":"e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9"} Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.643447 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"295671d5-4684-438c-8761-4a5d0eb6a9c5","Type":"ContainerDied","Data":"315ebbc182138ebf30ba7132dabab3344781917088dda50717dd8353292508b8"} Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.643472 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.643477 4791 scope.go:117] "RemoveContainer" containerID="e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.673280 4791 scope.go:117] "RemoveContainer" containerID="e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9" Feb 17 00:18:01 crc kubenswrapper[4791]: E0217 00:18:01.674144 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9\": container with ID starting with e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9 not found: ID does not exist" containerID="e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.674182 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9"} err="failed to get container status \"e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9\": rpc error: code = NotFound desc = could not find container \"e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9\": container with ID starting with e48b696e6befdef017dc2c92a69c56769e56ed142e271411c3e28861708b11f9 not found: ID does not exist" Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.676228 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Feb 17 00:18:01 crc kubenswrapper[4791]: I0217 00:18:01.685863 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Feb 17 00:18:03 crc kubenswrapper[4791]: I0217 00:18:03.237664 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="295671d5-4684-438c-8761-4a5d0eb6a9c5" path="/var/lib/kubelet/pods/295671d5-4684-438c-8761-4a5d0eb6a9c5/volumes" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.180650 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Feb 17 00:18:11 crc kubenswrapper[4791]: E0217 00:18:11.181456 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295671d5-4684-438c-8761-4a5d0eb6a9c5" containerName="git-clone" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.181471 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="295671d5-4684-438c-8761-4a5d0eb6a9c5" containerName="git-clone" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.181587 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="295671d5-4684-438c-8761-4a5d0eb6a9c5" containerName="git-clone" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.182427 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.184327 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.184623 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-global-ca" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.184748 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-sys-config" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.184861 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-ca" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.203818 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/service-telemetry-operator-4-build"] Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257283 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257323 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbwcd\" (UniqueName: \"kubernetes.io/projected/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-kube-api-access-xbwcd\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257344 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257363 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257406 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" 
(UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257433 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257490 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257519 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257540 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " 
pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257568 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257603 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.257624 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358437 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358555 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358622 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358691 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358742 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbwcd\" (UniqueName: \"kubernetes.io/projected/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-kube-api-access-xbwcd\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358792 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358842 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358936 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358947 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.358990 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.359184 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 17 
00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.359184 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.359207 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.359281 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.359297 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.359930 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.359961 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.360001 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.360162 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.360350 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.360828 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.365789 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.366770 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.386585 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbwcd\" (UniqueName: \"kubernetes.io/projected/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-kube-api-access-xbwcd\") pod \"service-telemetry-operator-4-build\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.510268 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:11 crc kubenswrapper[4791]: I0217 00:18:11.844272 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"]
Feb 17 00:18:12 crc kubenswrapper[4791]: I0217 00:18:12.720688 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8","Type":"ContainerStarted","Data":"df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b"}
Feb 17 00:18:12 crc kubenswrapper[4791]: I0217 00:18:12.721187 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8","Type":"ContainerStarted","Data":"642c451b43608526730ae9299d52d48dc5d8c9d9ba22f3ab1c76d4a4069d373a"}
Feb 17 00:18:12 crc kubenswrapper[4791]: E0217 00:18:12.803949 4791 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=5079511847538378697, SKID=, AKID=AD:CC:45:B9:DF:E8:B7:57:BC:FE:80:6F:F9:92:EA:D4:BA:33:46:63 failed: x509: certificate signed by unknown authority"
Feb 17 00:18:13 crc kubenswrapper[4791]: I0217 00:18:13.843582 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"]
Feb 17 00:18:14 crc kubenswrapper[4791]: I0217 00:18:14.737653 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-4-build" podUID="e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" containerName="git-clone" containerID="cri-o://df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b" gracePeriod=30
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.181825 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-4-build_e01fe51a-e7fc-45d8-80f9-8a6c767d97f8/git-clone/0.log"
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.182379 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.317697 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-node-pullsecrets\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") "
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.317744 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildcachedir\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") "
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.317770 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbwcd\" (UniqueName: \"kubernetes.io/projected/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-kube-api-access-xbwcd\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") "
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.317825 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildworkdir\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") "
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.317869 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.317899 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.318322 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.318943 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-proxy-ca-bundles\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") "
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.318994 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-ca-bundles\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") "
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319054 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-root\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") "
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319080 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-blob-cache\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") "
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319137 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-push\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") "
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319166 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-pull\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") "
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319190 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-run\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") "
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319231 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-system-configs\") pod \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\" (UID: \"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8\") "
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319435 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319675 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-root\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319699 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319711 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildcachedir\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319724 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-buildworkdir\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319681 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319717 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319758 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.319893 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.320122 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.325880 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.326563 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-kube-api-access-xbwcd" (OuterVolumeSpecName: "kube-api-access-xbwcd") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "kube-api-access-xbwcd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.330605 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" (UID: "e01fe51a-e7fc-45d8-80f9-8a6c767d97f8"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421154 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421209 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421229 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-blob-cache\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421249 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421270 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421289 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-container-storage-run\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421306 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-build-system-configs\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.421323 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbwcd\" (UniqueName: \"kubernetes.io/projected/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8-kube-api-access-xbwcd\") on node \"crc\" DevicePath \"\""
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.747665 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-4-build_e01fe51a-e7fc-45d8-80f9-8a6c767d97f8/git-clone/0.log"
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.747751 4791 generic.go:334] "Generic (PLEG): container finished" podID="e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" containerID="df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b" exitCode=1
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.747804 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8","Type":"ContainerDied","Data":"df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b"}
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.747886 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"e01fe51a-e7fc-45d8-80f9-8a6c767d97f8","Type":"ContainerDied","Data":"642c451b43608526730ae9299d52d48dc5d8c9d9ba22f3ab1c76d4a4069d373a"}
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.747904 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build"
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.747923 4791 scope.go:117] "RemoveContainer" containerID="df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b"
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.771954 4791 scope.go:117] "RemoveContainer" containerID="df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b"
Feb 17 00:18:15 crc kubenswrapper[4791]: E0217 00:18:15.772545 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b\": container with ID starting with df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b not found: ID does not exist" containerID="df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b"
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.772585 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b"} err="failed to get container status \"df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b\": rpc error: code = NotFound desc = could not find container \"df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b\": container with ID starting with df64ec6c138fe5f8c7495a2359d20ea8bdedd5d71526fe5d6586efe9a636c57b not found: ID does not exist"
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.803620 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"]
Feb 17 00:18:15 crc kubenswrapper[4791]: I0217 00:18:15.828433 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"]
Feb 17 00:18:16 crc kubenswrapper[4791]: I0217 00:18:16.623067 4791 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 17 00:18:17 crc kubenswrapper[4791]: I0217 00:18:17.234700 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" path="/var/lib/kubelet/pods/e01fe51a-e7fc-45d8-80f9-8a6c767d97f8/volumes"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.289609 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"]
Feb 17 00:18:25 crc kubenswrapper[4791]: E0217 00:18:25.290655 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" containerName="git-clone"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.290671 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" containerName="git-clone"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.290801 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e01fe51a-e7fc-45d8-80f9-8a6c767d97f8" containerName="git-clone"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.291620 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.295085 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.295122 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-global-ca"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.295691 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-sys-config"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.306241 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-ca"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.334831 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"]
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.468884 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.468968 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469036 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469100 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469217 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d889h\" (UniqueName: \"kubernetes.io/projected/f00d7586-332c-485f-b171-5b3f4f7a0728-kube-api-access-d889h\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469379 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469445 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469495 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469580 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469791 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469903 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.469935 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.570852 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.570910 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.570929 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.570956 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.570976 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.570996 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571020 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571035 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571056 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571076 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571095 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d889h\" (UniqueName: \"kubernetes.io/projected/f00d7586-332c-485f-b171-5b3f4f7a0728-kube-api-access-d889h\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571164 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571431 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571443 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.571660 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.572024 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.572160 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.572627 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.572863 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.573363 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.574067 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.578299 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-push\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.578502 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 17 00:18:25 crc
kubenswrapper[4791]: I0217 00:18:25.603148 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d889h\" (UniqueName: \"kubernetes.io/projected/f00d7586-332c-485f-b171-5b3f4f7a0728-kube-api-access-d889h\") pod \"service-telemetry-operator-5-build\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " pod="service-telemetry/service-telemetry-operator-5-build" Feb 17 00:18:25 crc kubenswrapper[4791]: I0217 00:18:25.631387 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Feb 17 00:18:26 crc kubenswrapper[4791]: I0217 00:18:26.133223 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Feb 17 00:18:26 crc kubenswrapper[4791]: I0217 00:18:26.833176 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"f00d7586-332c-485f-b171-5b3f4f7a0728","Type":"ContainerStarted","Data":"0bcb2883fbf6e23deec884d193875f95422093db924f3ffdb529bd81606706b9"} Feb 17 00:18:26 crc kubenswrapper[4791]: I0217 00:18:26.833579 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"f00d7586-332c-485f-b171-5b3f4f7a0728","Type":"ContainerStarted","Data":"6360b057a63c6515bb6e37784e1d42fc2bc72fdee92da431ce8d64435bcc1761"} Feb 17 00:18:36 crc kubenswrapper[4791]: I0217 00:18:36.911412 4791 generic.go:334] "Generic (PLEG): container finished" podID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerID="0bcb2883fbf6e23deec884d193875f95422093db924f3ffdb529bd81606706b9" exitCode=0 Feb 17 00:18:36 crc kubenswrapper[4791]: I0217 00:18:36.911544 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" 
event={"ID":"f00d7586-332c-485f-b171-5b3f4f7a0728","Type":"ContainerDied","Data":"0bcb2883fbf6e23deec884d193875f95422093db924f3ffdb529bd81606706b9"} Feb 17 00:18:37 crc kubenswrapper[4791]: I0217 00:18:37.922992 4791 generic.go:334] "Generic (PLEG): container finished" podID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerID="9ec220b676197175b00a9c30efe79e7cc1a1947917c35b253116c4d516d0c7f3" exitCode=0 Feb 17 00:18:37 crc kubenswrapper[4791]: I0217 00:18:37.923050 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"f00d7586-332c-485f-b171-5b3f4f7a0728","Type":"ContainerDied","Data":"9ec220b676197175b00a9c30efe79e7cc1a1947917c35b253116c4d516d0c7f3"} Feb 17 00:18:38 crc kubenswrapper[4791]: I0217 00:18:38.040862 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-5-build_f00d7586-332c-485f-b171-5b3f4f7a0728/manage-dockerfile/0.log" Feb 17 00:18:38 crc kubenswrapper[4791]: I0217 00:18:38.934887 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"f00d7586-332c-485f-b171-5b3f4f7a0728","Type":"ContainerStarted","Data":"140e21b65fbe14e821c2dad12e49571665b14cf6c8f81e58c83c173b7ce95626"} Feb 17 00:18:38 crc kubenswrapper[4791]: I0217 00:18:38.978535 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-5-build" podStartSLOduration=13.978505948 podStartE2EDuration="13.978505948s" podCreationTimestamp="2026-02-17 00:18:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:18:38.974158531 +0000 UTC m=+776.453671128" watchObservedRunningTime="2026-02-17 00:18:38.978505948 +0000 UTC m=+776.458018485" Feb 17 00:18:54 crc kubenswrapper[4791]: I0217 00:18:54.973332 4791 patch_prober.go:28] interesting 
pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:18:54 crc kubenswrapper[4791]: I0217 00:18:54.974155 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:19:24 crc kubenswrapper[4791]: I0217 00:19:24.973427 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:19:24 crc kubenswrapper[4791]: I0217 00:19:24.974091 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:19:54 crc kubenswrapper[4791]: I0217 00:19:54.973340 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:19:54 crc kubenswrapper[4791]: I0217 00:19:54.974307 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:19:54 crc kubenswrapper[4791]: I0217 00:19:54.974378 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:19:54 crc kubenswrapper[4791]: I0217 00:19:54.975289 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25ad102ac8c402951283e13e1752712dd2cdbf609cf80aba767b2bf348b988eb"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 00:19:54 crc kubenswrapper[4791]: I0217 00:19:54.975388 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://25ad102ac8c402951283e13e1752712dd2cdbf609cf80aba767b2bf348b988eb" gracePeriod=600 Feb 17 00:19:55 crc kubenswrapper[4791]: I0217 00:19:55.499374 4791 generic.go:334] "Generic (PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="25ad102ac8c402951283e13e1752712dd2cdbf609cf80aba767b2bf348b988eb" exitCode=0 Feb 17 00:19:55 crc kubenswrapper[4791]: I0217 00:19:55.499514 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"25ad102ac8c402951283e13e1752712dd2cdbf609cf80aba767b2bf348b988eb"} Feb 17 00:19:55 crc kubenswrapper[4791]: I0217 00:19:55.499787 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" 
event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"722b3f07ba9ee7144e67186af8cb544101e6caf0a3dcec49ef92e61b07ce2b64"} Feb 17 00:19:55 crc kubenswrapper[4791]: I0217 00:19:55.499818 4791 scope.go:117] "RemoveContainer" containerID="5711491f64a78b6ffc6b1c8acbd4b44d9c0260c393f0431c6d0856df0b4e56a4" Feb 17 00:20:04 crc kubenswrapper[4791]: I0217 00:20:04.567619 4791 generic.go:334] "Generic (PLEG): container finished" podID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerID="140e21b65fbe14e821c2dad12e49571665b14cf6c8f81e58c83c173b7ce95626" exitCode=0 Feb 17 00:20:04 crc kubenswrapper[4791]: I0217 00:20:04.567673 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"f00d7586-332c-485f-b171-5b3f4f7a0728","Type":"ContainerDied","Data":"140e21b65fbe14e821c2dad12e49571665b14cf6c8f81e58c83c173b7ce95626"} Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.508418 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nrrr6"] Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.511165 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.525324 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nrrr6"] Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.591382 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db8wq\" (UniqueName: \"kubernetes.io/projected/e466cdef-0ad2-4536-a50f-e323c91438dd-kube-api-access-db8wq\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.592129 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-utilities\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.592362 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-catalog-content\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.693371 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-utilities\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.693422 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-catalog-content\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.693457 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db8wq\" (UniqueName: \"kubernetes.io/projected/e466cdef-0ad2-4536-a50f-e323c91438dd-kube-api-access-db8wq\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.694169 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-catalog-content\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.694504 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-utilities\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.720659 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db8wq\" (UniqueName: \"kubernetes.io/projected/e466cdef-0ad2-4536-a50f-e323c91438dd-kube-api-access-db8wq\") pod \"certified-operators-nrrr6\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.809246 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.871086 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996252 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-proxy-ca-bundles\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996319 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-buildcachedir\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996352 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-buildworkdir\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996382 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-system-configs\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996415 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-run\") pod 
\"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996441 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d889h\" (UniqueName: \"kubernetes.io/projected/f00d7586-332c-485f-b171-5b3f4f7a0728-kube-api-access-d889h\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996446 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996479 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-build-blob-cache\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996550 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-node-pullsecrets\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996591 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-root\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 
crc kubenswrapper[4791]: I0217 00:20:05.996623 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-push\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996673 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-pull\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996696 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-ca-bundles\") pod \"f00d7586-332c-485f-b171-5b3f4f7a0728\" (UID: \"f00d7586-332c-485f-b171-5b3f4f7a0728\") " Feb 17 00:20:05 crc kubenswrapper[4791]: I0217 00:20:05.996975 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:05.997348 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:05.997352 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:05.999150 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.002823 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.003315 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.005386 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.006795 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f00d7586-332c-485f-b171-5b3f4f7a0728-kube-api-access-d889h" (OuterVolumeSpecName: "kube-api-access-d889h") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "kube-api-access-d889h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.020281 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.040198 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.096099 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nrrr6"] Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.097936 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.097960 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.097973 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.097985 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d889h\" (UniqueName: \"kubernetes.io/projected/f00d7586-332c-485f-b171-5b3f4f7a0728-kube-api-access-d889h\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.097998 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f00d7586-332c-485f-b171-5b3f4f7a0728-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.098010 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.098022 4791 reconciler_common.go:293] 
"Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f00d7586-332c-485f-b171-5b3f4f7a0728-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.098034 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.098045 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00d7586-332c-485f-b171-5b3f4f7a0728-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.197453 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.205913 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.582280 4791 generic.go:334] "Generic (PLEG): container finished" podID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerID="91023745db29e801135dc6120120cb42a05ec1d76bda30737b7f087a1e9aa42c" exitCode=0 Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.582780 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrrr6" event={"ID":"e466cdef-0ad2-4536-a50f-e323c91438dd","Type":"ContainerDied","Data":"91023745db29e801135dc6120120cb42a05ec1d76bda30737b7f087a1e9aa42c"} Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.582838 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrrr6" event={"ID":"e466cdef-0ad2-4536-a50f-e323c91438dd","Type":"ContainerStarted","Data":"dfba4d631ae2af8e9109e32d876e3a5df34446a9b4d6fbf497cbfc4a9b323fe2"} Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.587976 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"f00d7586-332c-485f-b171-5b3f4f7a0728","Type":"ContainerDied","Data":"6360b057a63c6515bb6e37784e1d42fc2bc72fdee92da431ce8d64435bcc1761"} Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.588021 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6360b057a63c6515bb6e37784e1d42fc2bc72fdee92da431ce8d64435bcc1761" Feb 17 00:20:06 crc kubenswrapper[4791]: I0217 00:20:06.588169 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Feb 17 00:20:07 crc kubenswrapper[4791]: I0217 00:20:07.597757 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrrr6" event={"ID":"e466cdef-0ad2-4536-a50f-e323c91438dd","Type":"ContainerStarted","Data":"e00bb2ccc62911d16b1cbd70fbc3e10594268dc34b98b4c3ad2cbcf7038bd13f"} Feb 17 00:20:08 crc kubenswrapper[4791]: I0217 00:20:08.052100 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "f00d7586-332c-485f-b171-5b3f4f7a0728" (UID: "f00d7586-332c-485f-b171-5b3f4f7a0728"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:08 crc kubenswrapper[4791]: I0217 00:20:08.134480 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f00d7586-332c-485f-b171-5b3f4f7a0728-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:08 crc kubenswrapper[4791]: I0217 00:20:08.609925 4791 generic.go:334] "Generic (PLEG): container finished" podID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerID="e00bb2ccc62911d16b1cbd70fbc3e10594268dc34b98b4c3ad2cbcf7038bd13f" exitCode=0 Feb 17 00:20:08 crc kubenswrapper[4791]: I0217 00:20:08.609985 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrrr6" event={"ID":"e466cdef-0ad2-4536-a50f-e323c91438dd","Type":"ContainerDied","Data":"e00bb2ccc62911d16b1cbd70fbc3e10594268dc34b98b4c3ad2cbcf7038bd13f"} Feb 17 00:20:09 crc kubenswrapper[4791]: I0217 00:20:09.619398 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrrr6" 
event={"ID":"e466cdef-0ad2-4536-a50f-e323c91438dd","Type":"ContainerStarted","Data":"14ead6ec9a53495cd0ea78d494f6f2c8c194f21dbb61943e7f960a5569b61ffe"} Feb 17 00:20:09 crc kubenswrapper[4791]: I0217 00:20:09.643405 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nrrr6" podStartSLOduration=2.11913963 podStartE2EDuration="4.643381566s" podCreationTimestamp="2026-02-17 00:20:05 +0000 UTC" firstStartedPulling="2026-02-17 00:20:06.585195676 +0000 UTC m=+864.064708213" lastFinishedPulling="2026-02-17 00:20:09.109437622 +0000 UTC m=+866.588950149" observedRunningTime="2026-02-17 00:20:09.636875011 +0000 UTC m=+867.116387538" watchObservedRunningTime="2026-02-17 00:20:09.643381566 +0000 UTC m=+867.122894103" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.292437 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Feb 17 00:20:10 crc kubenswrapper[4791]: E0217 00:20:10.292736 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerName="manage-dockerfile" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.292755 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerName="manage-dockerfile" Feb 17 00:20:10 crc kubenswrapper[4791]: E0217 00:20:10.292778 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerName="docker-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.292785 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerName="docker-build" Feb 17 00:20:10 crc kubenswrapper[4791]: E0217 00:20:10.292794 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerName="git-clone" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 
00:20:10.292802 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerName="git-clone" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.292927 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f00d7586-332c-485f-b171-5b3f4f7a0728" containerName="docker-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.293664 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.296323 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.296451 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.297010 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.298249 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.323479 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467023 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-push\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467066 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467089 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467133 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467169 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467201 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467223 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvllj\" (UniqueName: \"kubernetes.io/projected/bde8eea2-068e-4791-ad76-164945e7d646-kube-api-access-mvllj\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467240 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467273 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467300 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467332 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.467351 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569061 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569225 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569287 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvllj\" (UniqueName: \"kubernetes.io/projected/bde8eea2-068e-4791-ad76-164945e7d646-kube-api-access-mvllj\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569347 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569394 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569439 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569508 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569515 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc 
kubenswrapper[4791]: I0217 00:20:10.569550 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569737 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-push\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569821 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569891 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.569995 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.570425 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.570582 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.570643 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.570607 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.571609 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: 
\"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.572405 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.573048 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.573780 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.581639 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.584924 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-push\") pod 
\"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.597772 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvllj\" (UniqueName: \"kubernetes.io/projected/bde8eea2-068e-4791-ad76-164945e7d646-kube-api-access-mvllj\") pod \"smart-gateway-operator-1-build\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:10 crc kubenswrapper[4791]: I0217 00:20:10.607611 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:11 crc kubenswrapper[4791]: I0217 00:20:11.085007 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Feb 17 00:20:11 crc kubenswrapper[4791]: I0217 00:20:11.657039 4791 generic.go:334] "Generic (PLEG): container finished" podID="bde8eea2-068e-4791-ad76-164945e7d646" containerID="eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2" exitCode=0 Feb 17 00:20:11 crc kubenswrapper[4791]: I0217 00:20:11.657188 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"bde8eea2-068e-4791-ad76-164945e7d646","Type":"ContainerDied","Data":"eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2"} Feb 17 00:20:11 crc kubenswrapper[4791]: I0217 00:20:11.657594 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"bde8eea2-068e-4791-ad76-164945e7d646","Type":"ContainerStarted","Data":"1ae461848c21aee33759740caf6db92789b07c617851ddd28d9629d261bba1f0"} Feb 17 00:20:12 crc kubenswrapper[4791]: I0217 00:20:12.666879 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" 
event={"ID":"bde8eea2-068e-4791-ad76-164945e7d646","Type":"ContainerStarted","Data":"4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6"} Feb 17 00:20:12 crc kubenswrapper[4791]: I0217 00:20:12.701716 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=2.701686761 podStartE2EDuration="2.701686761s" podCreationTimestamp="2026-02-17 00:20:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:20:12.69816186 +0000 UTC m=+870.177674387" watchObservedRunningTime="2026-02-17 00:20:12.701686761 +0000 UTC m=+870.181199328" Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.631726 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-djrqd"] Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.634613 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.655333 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djrqd"] Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.812813 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsq4j\" (UniqueName: \"kubernetes.io/projected/79b4304a-5553-411d-a6df-e2af898a22b0-kube-api-access-rsq4j\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.813289 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79b4304a-5553-411d-a6df-e2af898a22b0-utilities\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.813317 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b4304a-5553-411d-a6df-e2af898a22b0-catalog-content\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.915068 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsq4j\" (UniqueName: \"kubernetes.io/projected/79b4304a-5553-411d-a6df-e2af898a22b0-kube-api-access-rsq4j\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.915223 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79b4304a-5553-411d-a6df-e2af898a22b0-utilities\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.915248 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b4304a-5553-411d-a6df-e2af898a22b0-catalog-content\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.915749 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79b4304a-5553-411d-a6df-e2af898a22b0-utilities\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.915802 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b4304a-5553-411d-a6df-e2af898a22b0-catalog-content\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.933855 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsq4j\" (UniqueName: \"kubernetes.io/projected/79b4304a-5553-411d-a6df-e2af898a22b0-kube-api-access-rsq4j\") pod \"community-operators-djrqd\" (UID: \"79b4304a-5553-411d-a6df-e2af898a22b0\") " pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:13 crc kubenswrapper[4791]: I0217 00:20:13.955253 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:14 crc kubenswrapper[4791]: I0217 00:20:14.236493 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djrqd"] Feb 17 00:20:14 crc kubenswrapper[4791]: W0217 00:20:14.240648 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79b4304a_5553_411d_a6df_e2af898a22b0.slice/crio-4a3df3f42d7dbafcfb3a603f7123b1f83835810b6656fb68e16d388e80383efd WatchSource:0}: Error finding container 4a3df3f42d7dbafcfb3a603f7123b1f83835810b6656fb68e16d388e80383efd: Status 404 returned error can't find the container with id 4a3df3f42d7dbafcfb3a603f7123b1f83835810b6656fb68e16d388e80383efd Feb 17 00:20:14 crc kubenswrapper[4791]: I0217 00:20:14.681496 4791 generic.go:334] "Generic (PLEG): container finished" podID="79b4304a-5553-411d-a6df-e2af898a22b0" containerID="6265710aa1598cf2760a216bcb97d3c1d36d120dc17c36049999d2b40b284834" exitCode=0 Feb 17 00:20:14 crc kubenswrapper[4791]: I0217 00:20:14.681598 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djrqd" event={"ID":"79b4304a-5553-411d-a6df-e2af898a22b0","Type":"ContainerDied","Data":"6265710aa1598cf2760a216bcb97d3c1d36d120dc17c36049999d2b40b284834"} Feb 17 00:20:14 crc kubenswrapper[4791]: I0217 00:20:14.681653 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djrqd" event={"ID":"79b4304a-5553-411d-a6df-e2af898a22b0","Type":"ContainerStarted","Data":"4a3df3f42d7dbafcfb3a603f7123b1f83835810b6656fb68e16d388e80383efd"} Feb 17 00:20:15 crc kubenswrapper[4791]: I0217 00:20:15.871692 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:15 crc kubenswrapper[4791]: I0217 00:20:15.872190 4791 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:15 crc kubenswrapper[4791]: I0217 00:20:15.919243 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:16 crc kubenswrapper[4791]: I0217 00:20:16.731756 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:16 crc kubenswrapper[4791]: I0217 00:20:16.992536 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nrrr6"] Feb 17 00:20:18 crc kubenswrapper[4791]: I0217 00:20:18.710268 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nrrr6" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="registry-server" containerID="cri-o://14ead6ec9a53495cd0ea78d494f6f2c8c194f21dbb61943e7f960a5569b61ffe" gracePeriod=2 Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.718744 4791 generic.go:334] "Generic (PLEG): container finished" podID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerID="14ead6ec9a53495cd0ea78d494f6f2c8c194f21dbb61943e7f960a5569b61ffe" exitCode=0 Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.718885 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrrr6" event={"ID":"e466cdef-0ad2-4536-a50f-e323c91438dd","Type":"ContainerDied","Data":"14ead6ec9a53495cd0ea78d494f6f2c8c194f21dbb61943e7f960a5569b61ffe"} Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.721367 4791 generic.go:334] "Generic (PLEG): container finished" podID="79b4304a-5553-411d-a6df-e2af898a22b0" containerID="b1e70ef860385035e08b05e1dac681a8ccfb5e9679b68717933094c9c7d4c761" exitCode=0 Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.721429 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-djrqd" event={"ID":"79b4304a-5553-411d-a6df-e2af898a22b0","Type":"ContainerDied","Data":"b1e70ef860385035e08b05e1dac681a8ccfb5e9679b68717933094c9c7d4c761"} Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.874528 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.998263 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-catalog-content\") pod \"e466cdef-0ad2-4536-a50f-e323c91438dd\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.998339 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-utilities\") pod \"e466cdef-0ad2-4536-a50f-e323c91438dd\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.998469 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db8wq\" (UniqueName: \"kubernetes.io/projected/e466cdef-0ad2-4536-a50f-e323c91438dd-kube-api-access-db8wq\") pod \"e466cdef-0ad2-4536-a50f-e323c91438dd\" (UID: \"e466cdef-0ad2-4536-a50f-e323c91438dd\") " Feb 17 00:20:19 crc kubenswrapper[4791]: I0217 00:20:19.999334 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-utilities" (OuterVolumeSpecName: "utilities") pod "e466cdef-0ad2-4536-a50f-e323c91438dd" (UID: "e466cdef-0ad2-4536-a50f-e323c91438dd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.007276 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e466cdef-0ad2-4536-a50f-e323c91438dd-kube-api-access-db8wq" (OuterVolumeSpecName: "kube-api-access-db8wq") pod "e466cdef-0ad2-4536-a50f-e323c91438dd" (UID: "e466cdef-0ad2-4536-a50f-e323c91438dd"). InnerVolumeSpecName "kube-api-access-db8wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.060187 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e466cdef-0ad2-4536-a50f-e323c91438dd" (UID: "e466cdef-0ad2-4536-a50f-e323c91438dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.100517 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.100578 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e466cdef-0ad2-4536-a50f-e323c91438dd-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.100599 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db8wq\" (UniqueName: \"kubernetes.io/projected/e466cdef-0ad2-4536-a50f-e323c91438dd-kube-api-access-db8wq\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.730310 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrrr6" 
event={"ID":"e466cdef-0ad2-4536-a50f-e323c91438dd","Type":"ContainerDied","Data":"dfba4d631ae2af8e9109e32d876e3a5df34446a9b4d6fbf497cbfc4a9b323fe2"} Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.730335 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nrrr6" Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.730666 4791 scope.go:117] "RemoveContainer" containerID="14ead6ec9a53495cd0ea78d494f6f2c8c194f21dbb61943e7f960a5569b61ffe" Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.733746 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djrqd" event={"ID":"79b4304a-5553-411d-a6df-e2af898a22b0","Type":"ContainerStarted","Data":"155fd9bdac7e97704fca02965c46c1bb92205e62adecfc173559e54d00730649"} Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.746429 4791 scope.go:117] "RemoveContainer" containerID="e00bb2ccc62911d16b1cbd70fbc3e10594268dc34b98b4c3ad2cbcf7038bd13f" Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.762848 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-djrqd" podStartSLOduration=2.158952874 podStartE2EDuration="7.762828474s" podCreationTimestamp="2026-02-17 00:20:13 +0000 UTC" firstStartedPulling="2026-02-17 00:20:14.697889501 +0000 UTC m=+872.177402028" lastFinishedPulling="2026-02-17 00:20:20.301765101 +0000 UTC m=+877.781277628" observedRunningTime="2026-02-17 00:20:20.755444382 +0000 UTC m=+878.234956929" watchObservedRunningTime="2026-02-17 00:20:20.762828474 +0000 UTC m=+878.242341011" Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.777204 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nrrr6"] Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.783382 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-nrrr6"] Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.791544 4791 scope.go:117] "RemoveContainer" containerID="91023745db29e801135dc6120120cb42a05ec1d76bda30737b7f087a1e9aa42c" Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.835394 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Feb 17 00:20:20 crc kubenswrapper[4791]: I0217 00:20:20.835673 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="bde8eea2-068e-4791-ad76-164945e7d646" containerName="docker-build" containerID="cri-o://4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6" gracePeriod=30 Feb 17 00:20:21 crc kubenswrapper[4791]: I0217 00:20:21.234796 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" path="/var/lib/kubelet/pods/e466cdef-0ad2-4536-a50f-e323c91438dd/volumes" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.416314 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Feb 17 00:20:22 crc kubenswrapper[4791]: E0217 00:20:22.416828 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="extract-utilities" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.416842 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="extract-utilities" Feb 17 00:20:22 crc kubenswrapper[4791]: E0217 00:20:22.416853 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="extract-content" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.416861 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="extract-content" 
Feb 17 00:20:22 crc kubenswrapper[4791]: E0217 00:20:22.416878 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="registry-server" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.416887 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="registry-server" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.417008 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e466cdef-0ad2-4536-a50f-e323c91438dd" containerName="registry-server" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.418067 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.419889 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.419909 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.421872 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.433364 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm2bn\" (UniqueName: \"kubernetes.io/projected/bcdc829d-2304-4576-8cdb-b6a15b577e54-kube-api-access-zm2bn\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.433500 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.433570 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.433655 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.433711 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.433759 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 
00:20:22.433932 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.434005 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-push\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.434079 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.434246 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.434352 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-pull\") pod 
\"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.434398 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.437023 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.535966 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536005 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536026 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: 
I0217 00:20:22.536045 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536076 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536099 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-push\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536133 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536164 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " 
pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536193 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536210 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536240 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm2bn\" (UniqueName: \"kubernetes.io/projected/bcdc829d-2304-4576-8cdb-b6a15b577e54-kube-api-access-zm2bn\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536262 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536200 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: 
\"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536449 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536654 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536729 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.536952 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.537186 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: 
\"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.537270 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.537479 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.537661 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.541794 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.549000 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: 
\"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-push\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.555566 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm2bn\" (UniqueName: \"kubernetes.io/projected/bcdc829d-2304-4576-8cdb-b6a15b577e54-kube-api-access-zm2bn\") pod \"smart-gateway-operator-2-build\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:22 crc kubenswrapper[4791]: I0217 00:20:22.801005 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.104944 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.156338 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_bde8eea2-068e-4791-ad76-164945e7d646/docker-build/0.log" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.157188 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249230 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-system-configs\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249280 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-node-pullsecrets\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249324 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-run\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249360 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-push\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249396 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-build-blob-cache\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249386 4791 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249419 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-root\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249452 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-pull\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249482 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-ca-bundles\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249507 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-buildcachedir\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249555 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mvllj\" (UniqueName: \"kubernetes.io/projected/bde8eea2-068e-4791-ad76-164945e7d646-kube-api-access-mvllj\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249622 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-proxy-ca-bundles\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249648 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-buildworkdir\") pod \"bde8eea2-068e-4791-ad76-164945e7d646\" (UID: \"bde8eea2-068e-4791-ad76-164945e7d646\") " Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.249908 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.250248 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.250269 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.250557 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.250741 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.251246 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.251301 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.251518 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.255835 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde8eea2-068e-4791-ad76-164945e7d646-kube-api-access-mvllj" (OuterVolumeSpecName: "kube-api-access-mvllj") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "kube-api-access-mvllj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.256395 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.260361 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.351658 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.351708 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.351728 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.351745 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.351764 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bde8eea2-068e-4791-ad76-164945e7d646-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 
00:20:23.351783 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde8eea2-068e-4791-ad76-164945e7d646-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.351799 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bde8eea2-068e-4791-ad76-164945e7d646-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.351816 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvllj\" (UniqueName: \"kubernetes.io/projected/bde8eea2-068e-4791-ad76-164945e7d646-kube-api-access-mvllj\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.402100 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.453493 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.699248 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "bde8eea2-068e-4791-ad76-164945e7d646" (UID: "bde8eea2-068e-4791-ad76-164945e7d646"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.757388 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bde8eea2-068e-4791-ad76-164945e7d646-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.769680 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"bcdc829d-2304-4576-8cdb-b6a15b577e54","Type":"ContainerStarted","Data":"6bc390cf261ee2ce905ed54511ecd2d4889323ecd586e5525896cd466846b745"} Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.769734 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"bcdc829d-2304-4576-8cdb-b6a15b577e54","Type":"ContainerStarted","Data":"89476eba0b9817ae2dd5c5b95b35b38fe9d88bc0388a005c9bcf9bbec9490271"} Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.772043 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_bde8eea2-068e-4791-ad76-164945e7d646/docker-build/0.log" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.772493 4791 generic.go:334] "Generic (PLEG): container finished" podID="bde8eea2-068e-4791-ad76-164945e7d646" containerID="4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6" exitCode=1 Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.772515 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"bde8eea2-068e-4791-ad76-164945e7d646","Type":"ContainerDied","Data":"4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6"} Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.772530 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" 
event={"ID":"bde8eea2-068e-4791-ad76-164945e7d646","Type":"ContainerDied","Data":"1ae461848c21aee33759740caf6db92789b07c617851ddd28d9629d261bba1f0"} Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.772546 4791 scope.go:117] "RemoveContainer" containerID="4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.772639 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.832330 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.850640 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.866481 4791 scope.go:117] "RemoveContainer" containerID="eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.897603 4791 scope.go:117] "RemoveContainer" containerID="4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6" Feb 17 00:20:23 crc kubenswrapper[4791]: E0217 00:20:23.897975 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6\": container with ID starting with 4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6 not found: ID does not exist" containerID="4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.898014 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6"} err="failed to get container status 
\"4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6\": rpc error: code = NotFound desc = could not find container \"4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6\": container with ID starting with 4925b2afd79484d489002a7091c946fb9622887c9e930a4e33e8eb6ee1a634c6 not found: ID does not exist" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.898044 4791 scope.go:117] "RemoveContainer" containerID="eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2" Feb 17 00:20:23 crc kubenswrapper[4791]: E0217 00:20:23.898341 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2\": container with ID starting with eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2 not found: ID does not exist" containerID="eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.898367 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2"} err="failed to get container status \"eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2\": rpc error: code = NotFound desc = could not find container \"eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2\": container with ID starting with eaf1c3ad960a372f759866559e13a737dcc65f7bae2c093e276d9d26d0d0edb2 not found: ID does not exist" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.956295 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 00:20:23.956425 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:23 crc kubenswrapper[4791]: I0217 
00:20:23.995288 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:24 crc kubenswrapper[4791]: I0217 00:20:24.790709 4791 generic.go:334] "Generic (PLEG): container finished" podID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerID="6bc390cf261ee2ce905ed54511ecd2d4889323ecd586e5525896cd466846b745" exitCode=0 Feb 17 00:20:24 crc kubenswrapper[4791]: I0217 00:20:24.790773 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"bcdc829d-2304-4576-8cdb-b6a15b577e54","Type":"ContainerDied","Data":"6bc390cf261ee2ce905ed54511ecd2d4889323ecd586e5525896cd466846b745"} Feb 17 00:20:25 crc kubenswrapper[4791]: I0217 00:20:25.232428 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde8eea2-068e-4791-ad76-164945e7d646" path="/var/lib/kubelet/pods/bde8eea2-068e-4791-ad76-164945e7d646/volumes" Feb 17 00:20:25 crc kubenswrapper[4791]: I0217 00:20:25.871260 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-djrqd" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.083484 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djrqd"] Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.127979 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v6qjq"] Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.128263 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v6qjq" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="registry-server" containerID="cri-o://ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1" gracePeriod=2 Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.487949 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.592025 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-catalog-content\") pod \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.592075 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-utilities\") pod \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.592222 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-729vq\" (UniqueName: \"kubernetes.io/projected/eebe5038-a970-42a4-81d4-fa84e6a64dd2-kube-api-access-729vq\") pod \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\" (UID: \"eebe5038-a970-42a4-81d4-fa84e6a64dd2\") " Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.593591 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-utilities" (OuterVolumeSpecName: "utilities") pod "eebe5038-a970-42a4-81d4-fa84e6a64dd2" (UID: "eebe5038-a970-42a4-81d4-fa84e6a64dd2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.597575 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eebe5038-a970-42a4-81d4-fa84e6a64dd2-kube-api-access-729vq" (OuterVolumeSpecName: "kube-api-access-729vq") pod "eebe5038-a970-42a4-81d4-fa84e6a64dd2" (UID: "eebe5038-a970-42a4-81d4-fa84e6a64dd2"). InnerVolumeSpecName "kube-api-access-729vq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.643491 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eebe5038-a970-42a4-81d4-fa84e6a64dd2" (UID: "eebe5038-a970-42a4-81d4-fa84e6a64dd2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.693121 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-729vq\" (UniqueName: \"kubernetes.io/projected/eebe5038-a970-42a4-81d4-fa84e6a64dd2-kube-api-access-729vq\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.693150 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.693161 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eebe5038-a970-42a4-81d4-fa84e6a64dd2-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.816604 4791 generic.go:334] "Generic (PLEG): container finished" podID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerID="e80128347dc7c950dc836d68c8b67a7cc01a8e43fb8f74b24f82416156d6c0c1" exitCode=0 Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.816689 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"bcdc829d-2304-4576-8cdb-b6a15b577e54","Type":"ContainerDied","Data":"e80128347dc7c950dc836d68c8b67a7cc01a8e43fb8f74b24f82416156d6c0c1"} Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.818948 4791 generic.go:334] "Generic (PLEG): container 
finished" podID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerID="ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1" exitCode=0 Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.819030 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v6qjq" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.819072 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6qjq" event={"ID":"eebe5038-a970-42a4-81d4-fa84e6a64dd2","Type":"ContainerDied","Data":"ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1"} Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.819133 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6qjq" event={"ID":"eebe5038-a970-42a4-81d4-fa84e6a64dd2","Type":"ContainerDied","Data":"c8fc013d2e008f0f951114c42fdaed6119c2933ea57de36f1dcfd1ae325e546c"} Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.819158 4791 scope.go:117] "RemoveContainer" containerID="ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.852450 4791 scope.go:117] "RemoveContainer" containerID="d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.853428 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_bcdc829d-2304-4576-8cdb-b6a15b577e54/manage-dockerfile/0.log" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.866599 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v6qjq"] Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.871241 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v6qjq"] Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.885602 4791 
scope.go:117] "RemoveContainer" containerID="737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.909229 4791 scope.go:117] "RemoveContainer" containerID="ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1" Feb 17 00:20:26 crc kubenswrapper[4791]: E0217 00:20:26.909583 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1\": container with ID starting with ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1 not found: ID does not exist" containerID="ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.909624 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1"} err="failed to get container status \"ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1\": rpc error: code = NotFound desc = could not find container \"ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1\": container with ID starting with ed7b094d4b0af5cbe89e9c6da70af29a27e1cfc9a98b0fd6d3269416ecfd81c1 not found: ID does not exist" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.909651 4791 scope.go:117] "RemoveContainer" containerID="d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c" Feb 17 00:20:26 crc kubenswrapper[4791]: E0217 00:20:26.909860 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c\": container with ID starting with d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c not found: ID does not exist" containerID="d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c" Feb 17 
00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.909880 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c"} err="failed to get container status \"d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c\": rpc error: code = NotFound desc = could not find container \"d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c\": container with ID starting with d69583cc2d298b1daf6617fbad84270f1593cb5a67be3d06e8883cb64b515f1c not found: ID does not exist" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.909893 4791 scope.go:117] "RemoveContainer" containerID="737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d" Feb 17 00:20:26 crc kubenswrapper[4791]: E0217 00:20:26.911704 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d\": container with ID starting with 737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d not found: ID does not exist" containerID="737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d" Feb 17 00:20:26 crc kubenswrapper[4791]: I0217 00:20:26.911730 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d"} err="failed to get container status \"737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d\": rpc error: code = NotFound desc = could not find container \"737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d\": container with ID starting with 737af514e798ea92190e562f1ad92c8289073df535ff899a9da15a87c241f33d not found: ID does not exist" Feb 17 00:20:27 crc kubenswrapper[4791]: I0217 00:20:27.226699 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" 
path="/var/lib/kubelet/pods/eebe5038-a970-42a4-81d4-fa84e6a64dd2/volumes" Feb 17 00:20:27 crc kubenswrapper[4791]: I0217 00:20:27.829232 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"bcdc829d-2304-4576-8cdb-b6a15b577e54","Type":"ContainerStarted","Data":"29e936b5fcf7ecd6131859059d0b7e9f208b819211cfcd228bed9247a6317ed7"} Feb 17 00:20:27 crc kubenswrapper[4791]: I0217 00:20:27.859471 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=5.85944953 podStartE2EDuration="5.85944953s" podCreationTimestamp="2026-02-17 00:20:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:20:27.855617091 +0000 UTC m=+885.335129648" watchObservedRunningTime="2026-02-17 00:20:27.85944953 +0000 UTC m=+885.338962067" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.776786 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-44n5c"] Feb 17 00:21:06 crc kubenswrapper[4791]: E0217 00:21:06.777692 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="extract-content" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.777713 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="extract-content" Feb 17 00:21:06 crc kubenswrapper[4791]: E0217 00:21:06.777742 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde8eea2-068e-4791-ad76-164945e7d646" containerName="docker-build" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.777754 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde8eea2-068e-4791-ad76-164945e7d646" containerName="docker-build" Feb 17 00:21:06 crc kubenswrapper[4791]: E0217 00:21:06.777777 4791 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="registry-server" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.777790 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="registry-server" Feb 17 00:21:06 crc kubenswrapper[4791]: E0217 00:21:06.777806 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde8eea2-068e-4791-ad76-164945e7d646" containerName="manage-dockerfile" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.777817 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde8eea2-068e-4791-ad76-164945e7d646" containerName="manage-dockerfile" Feb 17 00:21:06 crc kubenswrapper[4791]: E0217 00:21:06.777837 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="extract-utilities" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.777849 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="extract-utilities" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.778032 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="eebe5038-a970-42a4-81d4-fa84e6a64dd2" containerName="registry-server" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.778057 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde8eea2-068e-4791-ad76-164945e7d646" containerName="docker-build" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.779446 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.801911 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-44n5c"] Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.967877 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vckfc\" (UniqueName: \"kubernetes.io/projected/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-kube-api-access-vckfc\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.968009 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-utilities\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:06 crc kubenswrapper[4791]: I0217 00:21:06.968030 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-catalog-content\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.068889 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-utilities\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.068952 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-catalog-content\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.069016 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vckfc\" (UniqueName: \"kubernetes.io/projected/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-kube-api-access-vckfc\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.069641 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-utilities\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.069663 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-catalog-content\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.095178 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vckfc\" (UniqueName: \"kubernetes.io/projected/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-kube-api-access-vckfc\") pod \"redhat-operators-44n5c\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.110562 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:07 crc kubenswrapper[4791]: I0217 00:21:07.613637 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-44n5c"] Feb 17 00:21:08 crc kubenswrapper[4791]: I0217 00:21:08.126224 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n5c" event={"ID":"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b","Type":"ContainerStarted","Data":"23b09b406bee148561e1d38cf362a00225c7b1ed620ff54a974176c895c0e301"} Feb 17 00:21:09 crc kubenswrapper[4791]: I0217 00:21:09.132963 4791 generic.go:334] "Generic (PLEG): container finished" podID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerID="0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f" exitCode=0 Feb 17 00:21:09 crc kubenswrapper[4791]: I0217 00:21:09.133010 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n5c" event={"ID":"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b","Type":"ContainerDied","Data":"0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f"} Feb 17 00:21:11 crc kubenswrapper[4791]: I0217 00:21:11.144978 4791 generic.go:334] "Generic (PLEG): container finished" podID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerID="de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05" exitCode=0 Feb 17 00:21:11 crc kubenswrapper[4791]: I0217 00:21:11.145050 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n5c" event={"ID":"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b","Type":"ContainerDied","Data":"de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05"} Feb 17 00:21:12 crc kubenswrapper[4791]: I0217 00:21:12.152906 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n5c" 
event={"ID":"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b","Type":"ContainerStarted","Data":"bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71"} Feb 17 00:21:12 crc kubenswrapper[4791]: I0217 00:21:12.177967 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-44n5c" podStartSLOduration=3.708926854 podStartE2EDuration="6.177944743s" podCreationTimestamp="2026-02-17 00:21:06 +0000 UTC" firstStartedPulling="2026-02-17 00:21:09.136626164 +0000 UTC m=+926.616138691" lastFinishedPulling="2026-02-17 00:21:11.605644053 +0000 UTC m=+929.085156580" observedRunningTime="2026-02-17 00:21:12.173379903 +0000 UTC m=+929.652892430" watchObservedRunningTime="2026-02-17 00:21:12.177944743 +0000 UTC m=+929.657457270" Feb 17 00:21:17 crc kubenswrapper[4791]: I0217 00:21:17.111556 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:17 crc kubenswrapper[4791]: I0217 00:21:17.111970 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:18 crc kubenswrapper[4791]: I0217 00:21:18.156734 4791 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-44n5c" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="registry-server" probeResult="failure" output=< Feb 17 00:21:18 crc kubenswrapper[4791]: timeout: failed to connect service ":50051" within 1s Feb 17 00:21:18 crc kubenswrapper[4791]: > Feb 17 00:21:27 crc kubenswrapper[4791]: I0217 00:21:27.190926 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:27 crc kubenswrapper[4791]: I0217 00:21:27.251822 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:27 crc kubenswrapper[4791]: I0217 
00:21:27.428480 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-44n5c"] Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.278552 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-44n5c" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="registry-server" containerID="cri-o://bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71" gracePeriod=2 Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.675791 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.809536 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-utilities\") pod \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.809603 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-catalog-content\") pod \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.809720 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vckfc\" (UniqueName: \"kubernetes.io/projected/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-kube-api-access-vckfc\") pod \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\" (UID: \"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b\") " Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.810795 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-utilities" (OuterVolumeSpecName: 
"utilities") pod "eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" (UID: "eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.815971 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-kube-api-access-vckfc" (OuterVolumeSpecName: "kube-api-access-vckfc") pod "eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" (UID: "eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b"). InnerVolumeSpecName "kube-api-access-vckfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.911371 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.911400 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vckfc\" (UniqueName: \"kubernetes.io/projected/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-kube-api-access-vckfc\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:28 crc kubenswrapper[4791]: I0217 00:21:28.964757 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" (UID: "eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.012628 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.288144 4791 generic.go:334] "Generic (PLEG): container finished" podID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerID="bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71" exitCode=0 Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.288192 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n5c" event={"ID":"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b","Type":"ContainerDied","Data":"bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71"} Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.288262 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n5c" event={"ID":"eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b","Type":"ContainerDied","Data":"23b09b406bee148561e1d38cf362a00225c7b1ed620ff54a974176c895c0e301"} Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.288291 4791 scope.go:117] "RemoveContainer" containerID="bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.288220 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-44n5c" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.309493 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-44n5c"] Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.315294 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-44n5c"] Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.316369 4791 scope.go:117] "RemoveContainer" containerID="de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.357485 4791 scope.go:117] "RemoveContainer" containerID="0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.377142 4791 scope.go:117] "RemoveContainer" containerID="bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71" Feb 17 00:21:29 crc kubenswrapper[4791]: E0217 00:21:29.377689 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71\": container with ID starting with bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71 not found: ID does not exist" containerID="bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.377726 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71"} err="failed to get container status \"bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71\": rpc error: code = NotFound desc = could not find container \"bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71\": container with ID starting with bb8cfac13451f44d607670bb6bb11b3ec6e0db4cc96bac02634c96443a0a9a71 not found: ID does 
not exist" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.377752 4791 scope.go:117] "RemoveContainer" containerID="de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05" Feb 17 00:21:29 crc kubenswrapper[4791]: E0217 00:21:29.378042 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05\": container with ID starting with de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05 not found: ID does not exist" containerID="de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.378070 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05"} err="failed to get container status \"de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05\": rpc error: code = NotFound desc = could not find container \"de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05\": container with ID starting with de3491349309f815955ef67c9b840a85dc5d3ae5d10a00c027b12e1f08e03d05 not found: ID does not exist" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.378087 4791 scope.go:117] "RemoveContainer" containerID="0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f" Feb 17 00:21:29 crc kubenswrapper[4791]: E0217 00:21:29.378443 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f\": container with ID starting with 0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f not found: ID does not exist" containerID="0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f" Feb 17 00:21:29 crc kubenswrapper[4791]: I0217 00:21:29.378485 4791 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f"} err="failed to get container status \"0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f\": rpc error: code = NotFound desc = could not find container \"0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f\": container with ID starting with 0118fcc105e64c4cd456b945c0f06ad0c5edd0e993300aa490662d727a49395f not found: ID does not exist" Feb 17 00:21:31 crc kubenswrapper[4791]: I0217 00:21:31.228421 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" path="/var/lib/kubelet/pods/eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b/volumes" Feb 17 00:21:34 crc kubenswrapper[4791]: I0217 00:21:34.323975 4791 generic.go:334] "Generic (PLEG): container finished" podID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerID="29e936b5fcf7ecd6131859059d0b7e9f208b819211cfcd228bed9247a6317ed7" exitCode=0 Feb 17 00:21:34 crc kubenswrapper[4791]: I0217 00:21:34.324036 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"bcdc829d-2304-4576-8cdb-b6a15b577e54","Type":"ContainerDied","Data":"29e936b5fcf7ecd6131859059d0b7e9f208b819211cfcd228bed9247a6317ed7"} Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.617249 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800432 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-root\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800549 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildworkdir\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800592 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-ca-bundles\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800627 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildcachedir\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800714 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-push\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800783 4791 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-pull\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800830 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-node-pullsecrets\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800901 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.800945 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-run\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.801014 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-blob-cache\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.801071 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm2bn\" (UniqueName: \"kubernetes.io/projected/bcdc829d-2304-4576-8cdb-b6a15b577e54-kube-api-access-zm2bn\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.801200 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-proxy-ca-bundles\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.801256 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-system-configs\") pod \"bcdc829d-2304-4576-8cdb-b6a15b577e54\" (UID: \"bcdc829d-2304-4576-8cdb-b6a15b577e54\") " Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.801702 4791 reconciler_common.go:293] "Volume detached 
for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.802560 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.803360 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.803970 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.804769 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.805138 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.805553 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.807727 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.808753 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.809486 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcdc829d-2304-4576-8cdb-b6a15b577e54-kube-api-access-zm2bn" (OuterVolumeSpecName: "kube-api-access-zm2bn") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "kube-api-access-zm2bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902694 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902731 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902744 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902755 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902766 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902778 4791 
reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/bcdc829d-2304-4576-8cdb-b6a15b577e54-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902788 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bcdc829d-2304-4576-8cdb-b6a15b577e54-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902800 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:35 crc kubenswrapper[4791]: I0217 00:21:35.902811 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm2bn\" (UniqueName: \"kubernetes.io/projected/bcdc829d-2304-4576-8cdb-b6a15b577e54-kube-api-access-zm2bn\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:36 crc kubenswrapper[4791]: I0217 00:21:36.001064 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:36 crc kubenswrapper[4791]: I0217 00:21:36.003521 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:36 crc kubenswrapper[4791]: I0217 00:21:36.340665 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"bcdc829d-2304-4576-8cdb-b6a15b577e54","Type":"ContainerDied","Data":"89476eba0b9817ae2dd5c5b95b35b38fe9d88bc0388a005c9bcf9bbec9490271"} Feb 17 00:21:36 crc kubenswrapper[4791]: I0217 00:21:36.341141 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89476eba0b9817ae2dd5c5b95b35b38fe9d88bc0388a005c9bcf9bbec9490271" Feb 17 00:21:36 crc kubenswrapper[4791]: I0217 00:21:36.340720 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Feb 17 00:21:37 crc kubenswrapper[4791]: I0217 00:21:37.997862 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "bcdc829d-2304-4576-8cdb-b6a15b577e54" (UID: "bcdc829d-2304-4576-8cdb-b6a15b577e54"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:38 crc kubenswrapper[4791]: I0217 00:21:38.052994 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bcdc829d-2304-4576-8cdb-b6a15b577e54-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.313083 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 17 00:21:40 crc kubenswrapper[4791]: E0217 00:21:40.314045 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerName="manage-dockerfile" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314077 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerName="manage-dockerfile" Feb 17 00:21:40 crc kubenswrapper[4791]: E0217 00:21:40.314102 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="extract-utilities" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314149 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="extract-utilities" Feb 17 00:21:40 crc kubenswrapper[4791]: E0217 00:21:40.314167 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="extract-content" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314183 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="extract-content" Feb 17 00:21:40 crc kubenswrapper[4791]: E0217 00:21:40.314209 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerName="git-clone" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314224 4791 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerName="git-clone" Feb 17 00:21:40 crc kubenswrapper[4791]: E0217 00:21:40.314257 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerName="docker-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314272 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerName="docker-build" Feb 17 00:21:40 crc kubenswrapper[4791]: E0217 00:21:40.314294 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="registry-server" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314309 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="registry-server" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314553 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb7cace6-3484-4eb2-9f1d-bbb1ea8da31b" containerName="registry-server" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.314586 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcdc829d-2304-4576-8cdb-b6a15b577e54" containerName="docker-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.315852 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.318318 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.319373 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.319837 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.320441 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.333990 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381248 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-buildworkdir\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381321 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-push\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381393 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-build-blob-cache\") 
pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381635 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381763 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-system-configs\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381830 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdzws\" (UniqueName: \"kubernetes.io/projected/f212a215-55c7-48e3-a353-e0b74a390123-kube-api-access-kdzws\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381898 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-pull\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.381965 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-buildcachedir\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.382087 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-root\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.382208 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.382283 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.382377 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-run\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483144 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: 
\"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-pull\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483406 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-buildcachedir\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483497 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-root\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483577 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483516 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-buildcachedir\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483707 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-proxy-ca-bundles\") pod \"sg-core-1-build\" 
(UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483783 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-run\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483867 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-buildworkdir\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.483935 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-push\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484021 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484102 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 
00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484220 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-system-configs\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484279 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-buildworkdir\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484278 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484301 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdzws\" (UniqueName: \"kubernetes.io/projected/f212a215-55c7-48e3-a353-e0b74a390123-kube-api-access-kdzws\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484397 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484421 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-run\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.484699 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-root\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.485013 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.485037 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-system-configs\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.485647 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.489860 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: 
\"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-push\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.492738 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-pull\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.508499 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdzws\" (UniqueName: \"kubernetes.io/projected/f212a215-55c7-48e3-a353-e0b74a390123-kube-api-access-kdzws\") pod \"sg-core-1-build\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " pod="service-telemetry/sg-core-1-build" Feb 17 00:21:40 crc kubenswrapper[4791]: I0217 00:21:40.635391 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Feb 17 00:21:41 crc kubenswrapper[4791]: I0217 00:21:41.078367 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 17 00:21:41 crc kubenswrapper[4791]: I0217 00:21:41.370841 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f212a215-55c7-48e3-a353-e0b74a390123","Type":"ContainerStarted","Data":"f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4"} Feb 17 00:21:41 crc kubenswrapper[4791]: I0217 00:21:41.371967 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f212a215-55c7-48e3-a353-e0b74a390123","Type":"ContainerStarted","Data":"cbdee32da39609493a5fdab3853bbd124ef1afb74d05c58293494e87a42008c0"} Feb 17 00:21:42 crc kubenswrapper[4791]: I0217 00:21:42.383187 4791 generic.go:334] "Generic (PLEG): container finished" podID="f212a215-55c7-48e3-a353-e0b74a390123" containerID="f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4" exitCode=0 Feb 17 00:21:42 crc kubenswrapper[4791]: I0217 00:21:42.383313 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f212a215-55c7-48e3-a353-e0b74a390123","Type":"ContainerDied","Data":"f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4"} Feb 17 00:21:43 crc kubenswrapper[4791]: I0217 00:21:43.394672 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f212a215-55c7-48e3-a353-e0b74a390123","Type":"ContainerStarted","Data":"e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa"} Feb 17 00:21:43 crc kubenswrapper[4791]: I0217 00:21:43.429972 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=3.429947323 podStartE2EDuration="3.429947323s" podCreationTimestamp="2026-02-17 
00:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:21:43.429500719 +0000 UTC m=+960.909013266" watchObservedRunningTime="2026-02-17 00:21:43.429947323 +0000 UTC m=+960.909459860" Feb 17 00:21:50 crc kubenswrapper[4791]: I0217 00:21:50.908030 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 17 00:21:50 crc kubenswrapper[4791]: I0217 00:21:50.908910 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="f212a215-55c7-48e3-a353-e0b74a390123" containerName="docker-build" containerID="cri-o://e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa" gracePeriod=30 Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.264715 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_f212a215-55c7-48e3-a353-e0b74a390123/docker-build/0.log" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.265637 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.353732 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-pull\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354302 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-proxy-ca-bundles\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354338 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-ca-bundles\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354390 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-push\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354481 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-run\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354575 4791 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-root\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354629 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-buildcachedir\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354695 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-buildworkdir\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354728 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-build-blob-cache\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354803 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdzws\" (UniqueName: \"kubernetes.io/projected/f212a215-55c7-48e3-a353-e0b74a390123-kube-api-access-kdzws\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354845 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-node-pullsecrets\") pod 
\"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.354879 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-system-configs\") pod \"f212a215-55c7-48e3-a353-e0b74a390123\" (UID: \"f212a215-55c7-48e3-a353-e0b74a390123\") " Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355191 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355206 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355265 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355642 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355673 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355687 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355880 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.355955 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.356673 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.358914 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.363680 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f212a215-55c7-48e3-a353-e0b74a390123-kube-api-access-kdzws" (OuterVolumeSpecName: "kube-api-access-kdzws") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "kube-api-access-kdzws". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.363711 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.364828 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.457357 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.457875 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.457962 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.458056 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdzws\" (UniqueName: \"kubernetes.io/projected/f212a215-55c7-48e3-a353-e0b74a390123-kube-api-access-kdzws\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.458152 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f212a215-55c7-48e3-a353-e0b74a390123-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.458217 4791 
reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f212a215-55c7-48e3-a353-e0b74a390123-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.458275 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/f212a215-55c7-48e3-a353-e0b74a390123-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.459533 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.468899 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_f212a215-55c7-48e3-a353-e0b74a390123/docker-build/0.log" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.469815 4791 generic.go:334] "Generic (PLEG): container finished" podID="f212a215-55c7-48e3-a353-e0b74a390123" containerID="e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa" exitCode=1 Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.469955 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f212a215-55c7-48e3-a353-e0b74a390123","Type":"ContainerDied","Data":"e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa"} Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.470056 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" 
event={"ID":"f212a215-55c7-48e3-a353-e0b74a390123","Type":"ContainerDied","Data":"cbdee32da39609493a5fdab3853bbd124ef1afb74d05c58293494e87a42008c0"} Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.470185 4791 scope.go:117] "RemoveContainer" containerID="e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.469988 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.497348 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "f212a215-55c7-48e3-a353-e0b74a390123" (UID: "f212a215-55c7-48e3-a353-e0b74a390123"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.504350 4791 scope.go:117] "RemoveContainer" containerID="f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.534133 4791 scope.go:117] "RemoveContainer" containerID="e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa" Feb 17 00:21:51 crc kubenswrapper[4791]: E0217 00:21:51.534628 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa\": container with ID starting with e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa not found: ID does not exist" containerID="e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.534681 4791 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa"} err="failed to get container status \"e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa\": rpc error: code = NotFound desc = could not find container \"e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa\": container with ID starting with e3afe8bc6f0100bd6fcf4687b5ffad8fff4ab447c4fb2f7b16b89ed9c6a182fa not found: ID does not exist" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.534706 4791 scope.go:117] "RemoveContainer" containerID="f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4" Feb 17 00:21:51 crc kubenswrapper[4791]: E0217 00:21:51.535045 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4\": container with ID starting with f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4 not found: ID does not exist" containerID="f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.535088 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4"} err="failed to get container status \"f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4\": rpc error: code = NotFound desc = could not find container \"f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4\": container with ID starting with f9d209a659133287def75fe03f5efef7f796bd353ea14cd66a2fcbc9742847c4 not found: ID does not exist" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.559688 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 
crc kubenswrapper[4791]: I0217 00:21:51.559717 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f212a215-55c7-48e3-a353-e0b74a390123-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.829381 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 17 00:21:51 crc kubenswrapper[4791]: I0217 00:21:51.838828 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.499024 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Feb 17 00:21:52 crc kubenswrapper[4791]: E0217 00:21:52.499372 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f212a215-55c7-48e3-a353-e0b74a390123" containerName="manage-dockerfile" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.499395 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f212a215-55c7-48e3-a353-e0b74a390123" containerName="manage-dockerfile" Feb 17 00:21:52 crc kubenswrapper[4791]: E0217 00:21:52.499417 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f212a215-55c7-48e3-a353-e0b74a390123" containerName="docker-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.499430 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="f212a215-55c7-48e3-a353-e0b74a390123" containerName="docker-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.499619 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="f212a215-55c7-48e3-a353-e0b74a390123" containerName="docker-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.500801 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.502942 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.503179 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.503440 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.508525 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.540280 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.572534 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.572874 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kgz9\" (UniqueName: \"kubernetes.io/projected/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-kube-api-access-9kgz9\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573000 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-system-configs\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573158 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-push\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573305 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573413 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-run\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573505 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-root\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573621 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-pull\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573723 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildcachedir\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573826 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.573920 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.574020 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildworkdir\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.675942 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-push\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.676640 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.676756 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-run\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.676814 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-root\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.676854 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.676886 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: 
\"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-pull\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.676970 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildcachedir\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.677024 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.677075 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.677164 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildworkdir\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.677211 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-blob-cache\") pod \"sg-core-2-build\" (UID: 
\"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.677264 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kgz9\" (UniqueName: \"kubernetes.io/projected/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-kube-api-access-9kgz9\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.677301 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-system-configs\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.678008 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildcachedir\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.678067 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-run\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.678247 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-root\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc 
kubenswrapper[4791]: I0217 00:21:52.678482 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-system-configs\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.678566 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.678594 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildworkdir\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.679236 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.679679 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.682226 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" 
(UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-pull\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.684600 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-push\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.712393 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kgz9\" (UniqueName: \"kubernetes.io/projected/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-kube-api-access-9kgz9\") pod \"sg-core-2-build\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " pod="service-telemetry/sg-core-2-build" Feb 17 00:21:52 crc kubenswrapper[4791]: I0217 00:21:52.829245 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Feb 17 00:21:53 crc kubenswrapper[4791]: I0217 00:21:53.235028 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f212a215-55c7-48e3-a353-e0b74a390123" path="/var/lib/kubelet/pods/f212a215-55c7-48e3-a353-e0b74a390123/volumes" Feb 17 00:21:53 crc kubenswrapper[4791]: I0217 00:21:53.309766 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Feb 17 00:21:53 crc kubenswrapper[4791]: I0217 00:21:53.491078 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa","Type":"ContainerStarted","Data":"95153ff0c73c10588f012d75771a6f42fab1cc01a858b352d5f294ae0d3f9091"} Feb 17 00:21:54 crc kubenswrapper[4791]: I0217 00:21:54.502678 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa","Type":"ContainerStarted","Data":"dc5c21e29377746db37307f253136c118eeb822bad13d9daca12b013030f9b79"} Feb 17 00:21:55 crc kubenswrapper[4791]: I0217 00:21:55.514158 4791 generic.go:334] "Generic (PLEG): container finished" podID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerID="dc5c21e29377746db37307f253136c118eeb822bad13d9daca12b013030f9b79" exitCode=0 Feb 17 00:21:55 crc kubenswrapper[4791]: I0217 00:21:55.514258 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa","Type":"ContainerDied","Data":"dc5c21e29377746db37307f253136c118eeb822bad13d9daca12b013030f9b79"} Feb 17 00:21:56 crc kubenswrapper[4791]: I0217 00:21:56.520825 4791 generic.go:334] "Generic (PLEG): container finished" podID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerID="12bceec9da4fba16bd41634c80e6059d37799ddeaa3084515a4774bbe05b75ca" exitCode=0 Feb 17 00:21:56 crc kubenswrapper[4791]: I0217 00:21:56.520873 4791 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa","Type":"ContainerDied","Data":"12bceec9da4fba16bd41634c80e6059d37799ddeaa3084515a4774bbe05b75ca"} Feb 17 00:21:56 crc kubenswrapper[4791]: I0217 00:21:56.564515 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_eaaeddf5-72cd-46f1-b62e-83bf81db9dfa/manage-dockerfile/0.log" Feb 17 00:21:57 crc kubenswrapper[4791]: I0217 00:21:57.530394 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa","Type":"ContainerStarted","Data":"a2bfd7a1bbc9055b744243dfb16479f7bfe71f03549db82ec9e06b1a48794e06"} Feb 17 00:21:57 crc kubenswrapper[4791]: I0217 00:21:57.564644 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=5.564618831 podStartE2EDuration="5.564618831s" podCreationTimestamp="2026-02-17 00:21:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:21:57.561274748 +0000 UTC m=+975.040787315" watchObservedRunningTime="2026-02-17 00:21:57.564618831 +0000 UTC m=+975.044131358" Feb 17 00:22:24 crc kubenswrapper[4791]: I0217 00:22:24.973548 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:22:24 crc kubenswrapper[4791]: I0217 00:22:24.974288 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:22:54 crc kubenswrapper[4791]: I0217 00:22:54.973585 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:22:54 crc kubenswrapper[4791]: I0217 00:22:54.975400 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:23:24 crc kubenswrapper[4791]: I0217 00:23:24.973177 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:23:24 crc kubenswrapper[4791]: I0217 00:23:24.973702 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:23:24 crc kubenswrapper[4791]: I0217 00:23:24.973753 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:23:24 crc kubenswrapper[4791]: I0217 00:23:24.974355 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"722b3f07ba9ee7144e67186af8cb544101e6caf0a3dcec49ef92e61b07ce2b64"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 00:23:24 crc kubenswrapper[4791]: I0217 00:23:24.974409 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://722b3f07ba9ee7144e67186af8cb544101e6caf0a3dcec49ef92e61b07ce2b64" gracePeriod=600 Feb 17 00:23:25 crc kubenswrapper[4791]: I0217 00:23:25.159923 4791 generic.go:334] "Generic (PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="722b3f07ba9ee7144e67186af8cb544101e6caf0a3dcec49ef92e61b07ce2b64" exitCode=0 Feb 17 00:23:25 crc kubenswrapper[4791]: I0217 00:23:25.159996 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"722b3f07ba9ee7144e67186af8cb544101e6caf0a3dcec49ef92e61b07ce2b64"} Feb 17 00:23:25 crc kubenswrapper[4791]: I0217 00:23:25.160375 4791 scope.go:117] "RemoveContainer" containerID="25ad102ac8c402951283e13e1752712dd2cdbf609cf80aba767b2bf348b988eb" Feb 17 00:23:26 crc kubenswrapper[4791]: I0217 00:23:26.169762 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"71eb8be49ce7a773ea08603b5bc6a4c5e5daf5e96a1784fa4c29b0f51c425ed1"} Feb 17 00:25:09 crc kubenswrapper[4791]: I0217 00:25:09.902689 4791 generic.go:334] "Generic (PLEG): container finished" podID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" 
containerID="a2bfd7a1bbc9055b744243dfb16479f7bfe71f03549db82ec9e06b1a48794e06" exitCode=0 Feb 17 00:25:09 crc kubenswrapper[4791]: I0217 00:25:09.902757 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa","Type":"ContainerDied","Data":"a2bfd7a1bbc9055b744243dfb16479f7bfe71f03549db82ec9e06b1a48794e06"} Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.266188 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347298 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-pull\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347348 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-node-pullsecrets\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347391 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-run\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347428 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildworkdir\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: 
\"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347462 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-system-configs\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347487 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-proxy-ca-bundles\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347509 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-ca-bundles\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347536 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-blob-cache\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347560 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-root\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347546 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347582 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kgz9\" (UniqueName: \"kubernetes.io/projected/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-kube-api-access-9kgz9\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347699 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildcachedir\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.347789 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-push\") pod \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\" (UID: \"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa\") " Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.348481 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.348497 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: 
"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.349175 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.349210 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.352872 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.354419 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.361574 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.361619 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-kube-api-access-9kgz9" (OuterVolumeSpecName: "kube-api-access-9kgz9") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "kube-api-access-9kgz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.361875 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.370294 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.451479 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.451898 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.452082 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.452316 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.452485 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.452658 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.452867 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kgz9\" (UniqueName: \"kubernetes.io/projected/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-kube-api-access-9kgz9\") on node \"crc\" 
DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.453058 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.453272 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.719484 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.758051 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.920941 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"eaaeddf5-72cd-46f1-b62e-83bf81db9dfa","Type":"ContainerDied","Data":"95153ff0c73c10588f012d75771a6f42fab1cc01a858b352d5f294ae0d3f9091"} Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.920996 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95153ff0c73c10588f012d75771a6f42fab1cc01a858b352d5f294ae0d3f9091" Feb 17 00:25:11 crc kubenswrapper[4791]: I0217 00:25:11.921045 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Feb 17 00:25:14 crc kubenswrapper[4791]: I0217 00:25:14.640658 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" (UID: "eaaeddf5-72cd-46f1-b62e-83bf81db9dfa"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:25:14 crc kubenswrapper[4791]: I0217 00:25:14.710191 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/eaaeddf5-72cd-46f1-b62e-83bf81db9dfa-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.673437 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Feb 17 00:25:15 crc kubenswrapper[4791]: E0217 00:25:15.673803 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerName="manage-dockerfile" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.673826 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerName="manage-dockerfile" Feb 17 00:25:15 crc kubenswrapper[4791]: E0217 00:25:15.673842 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerName="docker-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.673854 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerName="docker-build" Feb 17 00:25:15 crc kubenswrapper[4791]: E0217 00:25:15.673880 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerName="git-clone" Feb 17 00:25:15 crc 
kubenswrapper[4791]: I0217 00:25:15.673893 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerName="git-clone" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.674069 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaaeddf5-72cd-46f1-b62e-83bf81db9dfa" containerName="docker-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.675175 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.680513 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.681839 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.682807 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.685168 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.713965 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832091 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832187 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832224 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832444 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-push\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832539 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832721 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832785 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-pull\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832831 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832879 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd9fg\" (UniqueName: \"kubernetes.io/projected/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-kube-api-access-nd9fg\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.832959 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.833085 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.833289 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934327 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934375 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934408 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934430 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934449 4791 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934481 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-push\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934518 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934588 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934672 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934618 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: 
\"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-pull\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934814 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934840 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd9fg\" (UniqueName: \"kubernetes.io/projected/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-kube-api-access-nd9fg\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934870 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.935304 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.934584 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-node-pullsecrets\") pod 
\"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.935568 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.935776 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.936167 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.936499 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.936878 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " 
pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.937493 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.944941 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-push\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.946207 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-pull\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:15 crc kubenswrapper[4791]: I0217 00:25:15.972766 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd9fg\" (UniqueName: \"kubernetes.io/projected/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-kube-api-access-nd9fg\") pod \"sg-bridge-1-build\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:16 crc kubenswrapper[4791]: I0217 00:25:16.007323 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:16 crc kubenswrapper[4791]: I0217 00:25:16.269035 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Feb 17 00:25:16 crc kubenswrapper[4791]: I0217 00:25:16.965951 4791 generic.go:334] "Generic (PLEG): container finished" podID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerID="9e535bde2286bdb920242dd34aac2ed1fa15b9ef0478082f22083bc8a2746eb5" exitCode=0 Feb 17 00:25:16 crc kubenswrapper[4791]: I0217 00:25:16.966142 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5","Type":"ContainerDied","Data":"9e535bde2286bdb920242dd34aac2ed1fa15b9ef0478082f22083bc8a2746eb5"} Feb 17 00:25:16 crc kubenswrapper[4791]: I0217 00:25:16.966286 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5","Type":"ContainerStarted","Data":"a8ad63dc01e765b914019b189cc098bd7c13ee5b3d0bff5696bd4f8ebabdb868"} Feb 17 00:25:17 crc kubenswrapper[4791]: I0217 00:25:17.980645 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5","Type":"ContainerStarted","Data":"78c57b34e25a65d194222b72515febb984062deed4a8a18a2de643bedb2b7636"} Feb 17 00:25:18 crc kubenswrapper[4791]: I0217 00:25:18.021487 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=3.021452435 podStartE2EDuration="3.021452435s" podCreationTimestamp="2026-02-17 00:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:25:18.01031677 +0000 UTC m=+1175.489829307" watchObservedRunningTime="2026-02-17 00:25:18.021452435 +0000 UTC m=+1175.500965002" Feb 17 00:25:25 
crc kubenswrapper[4791]: I0217 00:25:25.999023 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.000264 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-bridge-1-build" podUID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerName="docker-build" containerID="cri-o://78c57b34e25a65d194222b72515febb984062deed4a8a18a2de643bedb2b7636" gracePeriod=30 Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.049200 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_a118b8d1-a620-44fd-9e07-8fbb0e78bcf5/docker-build/0.log" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.050101 4791 generic.go:334] "Generic (PLEG): container finished" podID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerID="78c57b34e25a65d194222b72515febb984062deed4a8a18a2de643bedb2b7636" exitCode=1 Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.050200 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5","Type":"ContainerDied","Data":"78c57b34e25a65d194222b72515febb984062deed4a8a18a2de643bedb2b7636"} Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.342529 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_a118b8d1-a620-44fd-9e07-8fbb0e78bcf5/docker-build/0.log" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.343227 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.485966 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-node-pullsecrets\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486075 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildcachedir\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486087 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486141 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-pull\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486183 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486190 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-blob-cache\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486267 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-proxy-ca-bundles\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486325 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-push\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486364 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildworkdir\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486404 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-root\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486430 4791 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-system-configs\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486455 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd9fg\" (UniqueName: \"kubernetes.io/projected/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-kube-api-access-nd9fg\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486491 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-run\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486521 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-ca-bundles\") pod \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\" (UID: \"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5\") " Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486974 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.486994 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.487027 4791 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.488153 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.488487 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.488710 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.487623 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.492416 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.493260 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.494525 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-kube-api-access-nd9fg" (OuterVolumeSpecName: "kube-api-access-nd9fg") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "kube-api-access-nd9fg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.567204 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588554 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588580 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588591 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588599 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588608 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588616 4791 
reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588624 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd9fg\" (UniqueName: \"kubernetes.io/projected/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-kube-api-access-nd9fg\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588632 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:26 crc kubenswrapper[4791]: I0217 00:25:26.588640 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.046488 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" (UID: "a118b8d1-a620-44fd-9e07-8fbb0e78bcf5"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.059847 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_a118b8d1-a620-44fd-9e07-8fbb0e78bcf5/docker-build/0.log" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.060511 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"a118b8d1-a620-44fd-9e07-8fbb0e78bcf5","Type":"ContainerDied","Data":"a8ad63dc01e765b914019b189cc098bd7c13ee5b3d0bff5696bd4f8ebabdb868"} Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.060619 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.060634 4791 scope.go:117] "RemoveContainer" containerID="78c57b34e25a65d194222b72515febb984062deed4a8a18a2de643bedb2b7636" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.092999 4791 scope.go:117] "RemoveContainer" containerID="9e535bde2286bdb920242dd34aac2ed1fa15b9ef0478082f22083bc8a2746eb5" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.100741 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.117793 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.123582 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.235149 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" path="/var/lib/kubelet/pods/a118b8d1-a620-44fd-9e07-8fbb0e78bcf5/volumes" Feb 17 
00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.709453 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Feb 17 00:25:27 crc kubenswrapper[4791]: E0217 00:25:27.710342 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerName="manage-dockerfile" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.710457 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerName="manage-dockerfile" Feb 17 00:25:27 crc kubenswrapper[4791]: E0217 00:25:27.710580 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerName="docker-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.710657 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerName="docker-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.710849 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="a118b8d1-a620-44fd-9e07-8fbb0e78bcf5" containerName="docker-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.712151 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.714402 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.714438 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.714825 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.715077 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.749689 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.810949 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811041 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811205 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: 
\"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-push\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811266 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811303 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811336 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811383 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjnh6\" (UniqueName: \"kubernetes.io/projected/6809eb42-95c9-4cb7-b793-c4e855bd8f29-kube-api-access-rjnh6\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811418 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811503 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-pull\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811546 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811587 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.811629 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.912892 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-push\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913233 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913288 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913330 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913397 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjnh6\" (UniqueName: \"kubernetes.io/projected/6809eb42-95c9-4cb7-b793-c4e855bd8f29-kube-api-access-rjnh6\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913399 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913448 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913508 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-pull\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913553 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913609 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913670 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildworkdir\") 
pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913728 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.913813 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.914387 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.914675 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.914794 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " 
pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.914941 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.914963 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.915601 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.916090 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.916316 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.922605 4791 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-push\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.922669 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-pull\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:27 crc kubenswrapper[4791]: I0217 00:25:27.944356 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjnh6\" (UniqueName: \"kubernetes.io/projected/6809eb42-95c9-4cb7-b793-c4e855bd8f29-kube-api-access-rjnh6\") pod \"sg-bridge-2-build\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:28 crc kubenswrapper[4791]: I0217 00:25:28.029866 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Feb 17 00:25:28 crc kubenswrapper[4791]: I0217 00:25:28.297714 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Feb 17 00:25:29 crc kubenswrapper[4791]: I0217 00:25:29.090190 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"6809eb42-95c9-4cb7-b793-c4e855bd8f29","Type":"ContainerStarted","Data":"5ed6c7362353b66ccb65fbc96862f8cfe04c1a0062529c74c629bf54b700a947"} Feb 17 00:25:29 crc kubenswrapper[4791]: I0217 00:25:29.090253 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"6809eb42-95c9-4cb7-b793-c4e855bd8f29","Type":"ContainerStarted","Data":"959ba61d0afc9ccfd7a3013d85270690dccd549a55f01c5072900e4cab2d75f9"} Feb 17 00:25:30 crc kubenswrapper[4791]: I0217 00:25:30.099876 4791 generic.go:334] "Generic (PLEG): container finished" podID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerID="5ed6c7362353b66ccb65fbc96862f8cfe04c1a0062529c74c629bf54b700a947" exitCode=0 Feb 17 00:25:30 crc kubenswrapper[4791]: I0217 00:25:30.099952 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"6809eb42-95c9-4cb7-b793-c4e855bd8f29","Type":"ContainerDied","Data":"5ed6c7362353b66ccb65fbc96862f8cfe04c1a0062529c74c629bf54b700a947"} Feb 17 00:25:31 crc kubenswrapper[4791]: I0217 00:25:31.110902 4791 generic.go:334] "Generic (PLEG): container finished" podID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerID="54acd77816ddad2b49a1d79681a0cc4bd31839782e4e81051562044e5f869d33" exitCode=0 Feb 17 00:25:31 crc kubenswrapper[4791]: I0217 00:25:31.110977 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"6809eb42-95c9-4cb7-b793-c4e855bd8f29","Type":"ContainerDied","Data":"54acd77816ddad2b49a1d79681a0cc4bd31839782e4e81051562044e5f869d33"} Feb 17 00:25:31 
crc kubenswrapper[4791]: I0217 00:25:31.171533 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_6809eb42-95c9-4cb7-b793-c4e855bd8f29/manage-dockerfile/0.log" Feb 17 00:25:32 crc kubenswrapper[4791]: I0217 00:25:32.122873 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"6809eb42-95c9-4cb7-b793-c4e855bd8f29","Type":"ContainerStarted","Data":"6dbd63f68048cb7fd58a16ab0b938520a22f44df21885c80133588df55d80571"} Feb 17 00:25:54 crc kubenswrapper[4791]: I0217 00:25:54.973583 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:25:54 crc kubenswrapper[4791]: I0217 00:25:54.974239 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:26:16 crc kubenswrapper[4791]: I0217 00:26:16.467161 4791 generic.go:334] "Generic (PLEG): container finished" podID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerID="6dbd63f68048cb7fd58a16ab0b938520a22f44df21885c80133588df55d80571" exitCode=0 Feb 17 00:26:16 crc kubenswrapper[4791]: I0217 00:26:16.467278 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"6809eb42-95c9-4cb7-b793-c4e855bd8f29","Type":"ContainerDied","Data":"6dbd63f68048cb7fd58a16ab0b938520a22f44df21885c80133588df55d80571"} Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.756923 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.844248 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-root\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.844343 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-pull\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.844400 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-push\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.844427 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-proxy-ca-bundles\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.844984 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845309 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-node-pullsecrets\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845342 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-blob-cache\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845361 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjnh6\" (UniqueName: \"kubernetes.io/projected/6809eb42-95c9-4cb7-b793-c4e855bd8f29-kube-api-access-rjnh6\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845381 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildworkdir\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845432 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-run\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845456 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-system-configs\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845496 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildcachedir\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845519 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-ca-bundles\") pod \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\" (UID: \"6809eb42-95c9-4cb7-b793-c4e855bd8f29\") " Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845772 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.845377 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.846290 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.846337 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.846725 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.846915 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.847433 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.849707 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.850034 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.850176 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6809eb42-95c9-4cb7-b793-c4e855bd8f29-kube-api-access-rjnh6" (OuterVolumeSpecName: "kube-api-access-rjnh6") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "kube-api-access-rjnh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946450 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946481 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946490 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946498 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946506 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946515 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/6809eb42-95c9-4cb7-b793-c4e855bd8f29-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946523 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6809eb42-95c9-4cb7-b793-c4e855bd8f29-node-pullsecrets\") on node \"crc\" 
DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946532 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjnh6\" (UniqueName: \"kubernetes.io/projected/6809eb42-95c9-4cb7-b793-c4e855bd8f29-kube-api-access-rjnh6\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.946542 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:17 crc kubenswrapper[4791]: I0217 00:26:17.963597 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:26:18 crc kubenswrapper[4791]: I0217 00:26:18.047259 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:18 crc kubenswrapper[4791]: I0217 00:26:18.484805 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"6809eb42-95c9-4cb7-b793-c4e855bd8f29","Type":"ContainerDied","Data":"959ba61d0afc9ccfd7a3013d85270690dccd549a55f01c5072900e4cab2d75f9"} Feb 17 00:26:18 crc kubenswrapper[4791]: I0217 00:26:18.484850 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="959ba61d0afc9ccfd7a3013d85270690dccd549a55f01c5072900e4cab2d75f9" Feb 17 00:26:18 crc kubenswrapper[4791]: I0217 00:26:18.484855 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Feb 17 00:26:18 crc kubenswrapper[4791]: I0217 00:26:18.513085 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "6809eb42-95c9-4cb7-b793-c4e855bd8f29" (UID: "6809eb42-95c9-4cb7-b793-c4e855bd8f29"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:26:18 crc kubenswrapper[4791]: I0217 00:26:18.552955 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6809eb42-95c9-4cb7-b793-c4e855bd8f29-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.130737 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Feb 17 00:26:22 crc kubenswrapper[4791]: E0217 00:26:22.131610 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerName="docker-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.131642 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerName="docker-build" Feb 17 00:26:22 crc kubenswrapper[4791]: E0217 00:26:22.131670 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerName="manage-dockerfile" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.131687 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerName="manage-dockerfile" Feb 17 00:26:22 crc kubenswrapper[4791]: E0217 00:26:22.131723 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerName="git-clone" Feb 17 00:26:22 crc 
kubenswrapper[4791]: I0217 00:26:22.131740 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerName="git-clone" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.132014 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="6809eb42-95c9-4cb7-b793-c4e855bd8f29" containerName="docker-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.133478 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.135665 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.135718 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.136143 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.136147 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.145748 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.303805 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.303899 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.303969 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-675m6\" (UniqueName: \"kubernetes.io/projected/4859167c-9cba-498e-85e0-25710c5c93ec-kube-api-access-675m6\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304016 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304073 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304134 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: 
\"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304315 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304392 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304444 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304565 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304602 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.304662 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405465 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405533 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405575 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: 
I0217 00:26:22.405617 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405647 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405699 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405734 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405786 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " 
pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405815 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-675m6\" (UniqueName: \"kubernetes.io/projected/4859167c-9cba-498e-85e0-25710c5c93ec-kube-api-access-675m6\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405839 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405872 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405894 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.406073 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-node-pullsecrets\") pod 
\"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.405974 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.406276 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.406643 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.406689 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.407301 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-ca-bundles\") pod 
\"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.407388 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.407427 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.407910 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.413466 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.423349 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: 
\"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.447835 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-675m6\" (UniqueName: \"kubernetes.io/projected/4859167c-9cba-498e-85e0-25710c5c93ec-kube-api-access-675m6\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.449446 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:22 crc kubenswrapper[4791]: I0217 00:26:22.920169 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Feb 17 00:26:23 crc kubenswrapper[4791]: I0217 00:26:23.523695 4791 generic.go:334] "Generic (PLEG): container finished" podID="4859167c-9cba-498e-85e0-25710c5c93ec" containerID="483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33" exitCode=0 Feb 17 00:26:23 crc kubenswrapper[4791]: I0217 00:26:23.523793 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"4859167c-9cba-498e-85e0-25710c5c93ec","Type":"ContainerDied","Data":"483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33"} Feb 17 00:26:23 crc kubenswrapper[4791]: I0217 00:26:23.523944 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"4859167c-9cba-498e-85e0-25710c5c93ec","Type":"ContainerStarted","Data":"4cf7ca9231f7176ed8c8ea9c4c5efff6bf42ad35a9176f33da4b1f486678be9c"} Feb 17 00:26:24 crc kubenswrapper[4791]: I0217 00:26:24.534587 4791 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"4859167c-9cba-498e-85e0-25710c5c93ec","Type":"ContainerStarted","Data":"59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb"} Feb 17 00:26:24 crc kubenswrapper[4791]: I0217 00:26:24.571355 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=2.571334542 podStartE2EDuration="2.571334542s" podCreationTimestamp="2026-02-17 00:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:26:24.567492122 +0000 UTC m=+1242.047004659" watchObservedRunningTime="2026-02-17 00:26:24.571334542 +0000 UTC m=+1242.050847069" Feb 17 00:26:24 crc kubenswrapper[4791]: I0217 00:26:24.972826 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:26:24 crc kubenswrapper[4791]: I0217 00:26:24.972894 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:26:32 crc kubenswrapper[4791]: I0217 00:26:32.881731 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Feb 17 00:26:32 crc kubenswrapper[4791]: I0217 00:26:32.882676 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" 
podUID="4859167c-9cba-498e-85e0-25710c5c93ec" containerName="docker-build" containerID="cri-o://59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb" gracePeriod=30 Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.304808 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_4859167c-9cba-498e-85e0-25710c5c93ec/docker-build/0.log" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.305770 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356171 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-ca-bundles\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356327 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-push\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356411 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-system-configs\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356481 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-buildworkdir\") pod 
\"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356540 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-pull\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356610 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-node-pullsecrets\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356665 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-root\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356750 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-buildcachedir\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356802 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-proxy-ca-bundles\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356854 4791 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-build-blob-cache\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356922 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-675m6\" (UniqueName: \"kubernetes.io/projected/4859167c-9cba-498e-85e0-25710c5c93ec-kube-api-access-675m6\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.356996 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-run\") pod \"4859167c-9cba-498e-85e0-25710c5c93ec\" (UID: \"4859167c-9cba-498e-85e0-25710c5c93ec\") " Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.357095 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.357517 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.357524 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.357571 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.357625 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.357664 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.359637 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.360443 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.363699 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.366090 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4859167c-9cba-498e-85e0-25710c5c93ec-kube-api-access-675m6" (OuterVolumeSpecName: "kube-api-access-675m6") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "kube-api-access-675m6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.369270 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.453625 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459765 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459813 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459874 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4859167c-9cba-498e-85e0-25710c5c93ec-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459894 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459911 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459927 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-675m6\" (UniqueName: \"kubernetes.io/projected/4859167c-9cba-498e-85e0-25710c5c93ec-kube-api-access-675m6\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459944 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-run\") on node \"crc\" DevicePath \"\"" 
Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459960 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459977 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/4859167c-9cba-498e-85e0-25710c5c93ec-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.459993 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4859167c-9cba-498e-85e0-25710c5c93ec-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.606897 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_4859167c-9cba-498e-85e0-25710c5c93ec/docker-build/0.log" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.607767 4791 generic.go:334] "Generic (PLEG): container finished" podID="4859167c-9cba-498e-85e0-25710c5c93ec" containerID="59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb" exitCode=1 Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.607815 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"4859167c-9cba-498e-85e0-25710c5c93ec","Type":"ContainerDied","Data":"59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb"} Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.607853 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"4859167c-9cba-498e-85e0-25710c5c93ec","Type":"ContainerDied","Data":"4cf7ca9231f7176ed8c8ea9c4c5efff6bf42ad35a9176f33da4b1f486678be9c"} Feb 17 00:26:33 
crc kubenswrapper[4791]: I0217 00:26:33.607922 4791 scope.go:117] "RemoveContainer" containerID="59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.608090 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.634594 4791 scope.go:117] "RemoveContainer" containerID="483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.669484 4791 scope.go:117] "RemoveContainer" containerID="59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb" Feb 17 00:26:33 crc kubenswrapper[4791]: E0217 00:26:33.670346 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb\": container with ID starting with 59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb not found: ID does not exist" containerID="59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.670419 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb"} err="failed to get container status \"59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb\": rpc error: code = NotFound desc = could not find container \"59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb\": container with ID starting with 59c883a9d1d29f5835ff4c76c87e8cbadfa7b2819d1c1d2c0904004095bfc1cb not found: ID does not exist" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.670458 4791 scope.go:117] "RemoveContainer" containerID="483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33" Feb 17 00:26:33 crc kubenswrapper[4791]: 
E0217 00:26:33.670892 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33\": container with ID starting with 483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33 not found: ID does not exist" containerID="483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.670959 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33"} err="failed to get container status \"483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33\": rpc error: code = NotFound desc = could not find container \"483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33\": container with ID starting with 483a1bcca76e3db10784b97f52ee57ecba522e10db6fa16777a7e1634ccf2a33 not found: ID does not exist" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.824575 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "4859167c-9cba-498e-85e0-25710c5c93ec" (UID: "4859167c-9cba-498e-85e0-25710c5c93ec"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.866855 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4859167c-9cba-498e-85e0-25710c5c93ec-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.940732 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Feb 17 00:26:33 crc kubenswrapper[4791]: I0217 00:26:33.947628 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.556127 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Feb 17 00:26:34 crc kubenswrapper[4791]: E0217 00:26:34.556413 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4859167c-9cba-498e-85e0-25710c5c93ec" containerName="manage-dockerfile" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.556432 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="4859167c-9cba-498e-85e0-25710c5c93ec" containerName="manage-dockerfile" Feb 17 00:26:34 crc kubenswrapper[4791]: E0217 00:26:34.556444 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4859167c-9cba-498e-85e0-25710c5c93ec" containerName="docker-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.556450 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="4859167c-9cba-498e-85e0-25710c5c93ec" containerName="docker-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.556553 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="4859167c-9cba-498e-85e0-25710c5c93ec" containerName="docker-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.557333 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.559454 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.559962 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.560327 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-kmssz" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.560708 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.578724 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.578972 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579125 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-proxy-ca-bundles\") 
pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579235 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t7tm\" (UniqueName: \"kubernetes.io/projected/df17f317-5914-40b6-bfb7-12a157eb4b95-kube-api-access-7t7tm\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579374 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579494 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579601 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579720 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579815 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579932 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.579823 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.580022 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.580265 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" 
(UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681315 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681374 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681409 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681438 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t7tm\" (UniqueName: \"kubernetes.io/projected/df17f317-5914-40b6-bfb7-12a157eb4b95-kube-api-access-7t7tm\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681464 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681494 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681522 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681559 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681589 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681628 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681648 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681671 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.681732 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682048 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: 
\"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682148 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682216 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682427 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682671 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682817 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-build-blob-cache\") pod 
\"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682911 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.682912 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.689187 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.700858 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.707751 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t7tm\" (UniqueName: 
\"kubernetes.io/projected/df17f317-5914-40b6-bfb7-12a157eb4b95-kube-api-access-7t7tm\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:34 crc kubenswrapper[4791]: I0217 00:26:34.870487 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:26:35 crc kubenswrapper[4791]: I0217 00:26:35.159261 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Feb 17 00:26:35 crc kubenswrapper[4791]: W0217 00:26:35.173876 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf17f317_5914_40b6_bfb7_12a157eb4b95.slice/crio-21d543c6e21f9f761c2afa3bc899bece8caad13427e2538af83279dce190781e WatchSource:0}: Error finding container 21d543c6e21f9f761c2afa3bc899bece8caad13427e2538af83279dce190781e: Status 404 returned error can't find the container with id 21d543c6e21f9f761c2afa3bc899bece8caad13427e2538af83279dce190781e Feb 17 00:26:35 crc kubenswrapper[4791]: I0217 00:26:35.248771 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4859167c-9cba-498e-85e0-25710c5c93ec" path="/var/lib/kubelet/pods/4859167c-9cba-498e-85e0-25710c5c93ec/volumes" Feb 17 00:26:35 crc kubenswrapper[4791]: I0217 00:26:35.626364 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"df17f317-5914-40b6-bfb7-12a157eb4b95","Type":"ContainerStarted","Data":"21d543c6e21f9f761c2afa3bc899bece8caad13427e2538af83279dce190781e"} Feb 17 00:26:36 crc kubenswrapper[4791]: I0217 00:26:36.634726 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" 
event={"ID":"df17f317-5914-40b6-bfb7-12a157eb4b95","Type":"ContainerStarted","Data":"070a583a962a9fe7b0ef5baf99e88b5a7b72f4c99439f3c4e1e3e4d468484fff"} Feb 17 00:26:37 crc kubenswrapper[4791]: I0217 00:26:37.646669 4791 generic.go:334] "Generic (PLEG): container finished" podID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerID="070a583a962a9fe7b0ef5baf99e88b5a7b72f4c99439f3c4e1e3e4d468484fff" exitCode=0 Feb 17 00:26:37 crc kubenswrapper[4791]: I0217 00:26:37.647826 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"df17f317-5914-40b6-bfb7-12a157eb4b95","Type":"ContainerDied","Data":"070a583a962a9fe7b0ef5baf99e88b5a7b72f4c99439f3c4e1e3e4d468484fff"} Feb 17 00:26:38 crc kubenswrapper[4791]: I0217 00:26:38.657002 4791 generic.go:334] "Generic (PLEG): container finished" podID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerID="5770db58104d06079dbc8cd1063caafc530f7c4ce8a1ab7a39d501a3ecc96f8c" exitCode=0 Feb 17 00:26:38 crc kubenswrapper[4791]: I0217 00:26:38.657160 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"df17f317-5914-40b6-bfb7-12a157eb4b95","Type":"ContainerDied","Data":"5770db58104d06079dbc8cd1063caafc530f7c4ce8a1ab7a39d501a3ecc96f8c"} Feb 17 00:26:38 crc kubenswrapper[4791]: I0217 00:26:38.719123 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_df17f317-5914-40b6-bfb7-12a157eb4b95/manage-dockerfile/0.log" Feb 17 00:26:39 crc kubenswrapper[4791]: I0217 00:26:39.669525 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"df17f317-5914-40b6-bfb7-12a157eb4b95","Type":"ContainerStarted","Data":"67be1c6ded0357210ae6d498b2673cb5f728955a7c2e04e23f098517f94f7b7e"} Feb 17 00:26:39 crc kubenswrapper[4791]: I0217 00:26:39.749934 4791 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=5.74986321 podStartE2EDuration="5.74986321s" podCreationTimestamp="2026-02-17 00:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:26:39.728231488 +0000 UTC m=+1257.207744025" watchObservedRunningTime="2026-02-17 00:26:39.74986321 +0000 UTC m=+1257.229375767" Feb 17 00:26:54 crc kubenswrapper[4791]: I0217 00:26:54.973455 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:26:54 crc kubenswrapper[4791]: I0217 00:26:54.974119 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:26:54 crc kubenswrapper[4791]: I0217 00:26:54.974168 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:26:54 crc kubenswrapper[4791]: I0217 00:26:54.974877 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"71eb8be49ce7a773ea08603b5bc6a4c5e5daf5e96a1784fa4c29b0f51c425ed1"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 00:26:54 crc kubenswrapper[4791]: I0217 00:26:54.974930 4791 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://71eb8be49ce7a773ea08603b5bc6a4c5e5daf5e96a1784fa4c29b0f51c425ed1" gracePeriod=600 Feb 17 00:26:55 crc kubenswrapper[4791]: I0217 00:26:55.780319 4791 generic.go:334] "Generic (PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="71eb8be49ce7a773ea08603b5bc6a4c5e5daf5e96a1784fa4c29b0f51c425ed1" exitCode=0 Feb 17 00:26:55 crc kubenswrapper[4791]: I0217 00:26:55.780437 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"71eb8be49ce7a773ea08603b5bc6a4c5e5daf5e96a1784fa4c29b0f51c425ed1"} Feb 17 00:26:55 crc kubenswrapper[4791]: I0217 00:26:55.780864 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"0ab0ea25cf5596643285f502958b16b41d67465e6ae2b0bd26ff7c3c0dfd7a57"} Feb 17 00:26:55 crc kubenswrapper[4791]: I0217 00:26:55.780909 4791 scope.go:117] "RemoveContainer" containerID="722b3f07ba9ee7144e67186af8cb544101e6caf0a3dcec49ef92e61b07ce2b64" Feb 17 00:27:35 crc kubenswrapper[4791]: I0217 00:27:35.051423 4791 generic.go:334] "Generic (PLEG): container finished" podID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerID="67be1c6ded0357210ae6d498b2673cb5f728955a7c2e04e23f098517f94f7b7e" exitCode=0 Feb 17 00:27:35 crc kubenswrapper[4791]: I0217 00:27:35.051526 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"df17f317-5914-40b6-bfb7-12a157eb4b95","Type":"ContainerDied","Data":"67be1c6ded0357210ae6d498b2673cb5f728955a7c2e04e23f098517f94f7b7e"} Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 
00:27:36.311565 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403630 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-buildworkdir\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403692 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-ca-bundles\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403757 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t7tm\" (UniqueName: \"kubernetes.io/projected/df17f317-5914-40b6-bfb7-12a157eb4b95-kube-api-access-7t7tm\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403785 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-buildcachedir\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403807 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-system-configs\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 
00:27:36.403833 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-root\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403854 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-proxy-ca-bundles\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403880 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-run\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403895 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-build-blob-cache\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403914 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-pull\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403930 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-kmssz-push\" (UniqueName: 
\"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-push\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.403953 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-node-pullsecrets\") pod \"df17f317-5914-40b6-bfb7-12a157eb4b95\" (UID: \"df17f317-5914-40b6-bfb7-12a157eb4b95\") " Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.404166 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.404537 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.405217 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.405822 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.406513 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.407281 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.407430 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.416277 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df17f317-5914-40b6-bfb7-12a157eb4b95-kube-api-access-7t7tm" (OuterVolumeSpecName: "kube-api-access-7t7tm") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "kube-api-access-7t7tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.416340 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-push" (OuterVolumeSpecName: "builder-dockercfg-kmssz-push") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "builder-dockercfg-kmssz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.416412 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-pull" (OuterVolumeSpecName: "builder-dockercfg-kmssz-pull") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "builder-dockercfg-kmssz-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505781 4791 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505827 4791 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505844 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t7tm\" (UniqueName: \"kubernetes.io/projected/df17f317-5914-40b6-bfb7-12a157eb4b95-kube-api-access-7t7tm\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505861 4791 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505878 4791 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505894 4791 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df17f317-5914-40b6-bfb7-12a157eb4b95-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505909 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 
crc kubenswrapper[4791]: I0217 00:27:36.505925 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-pull\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-pull\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505940 4791 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-kmssz-push\" (UniqueName: \"kubernetes.io/secret/df17f317-5914-40b6-bfb7-12a157eb4b95-builder-dockercfg-kmssz-push\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.505955 4791 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/df17f317-5914-40b6-bfb7-12a157eb4b95-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.547602 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:27:36 crc kubenswrapper[4791]: I0217 00:27:36.607788 4791 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:37 crc kubenswrapper[4791]: I0217 00:27:37.074203 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"df17f317-5914-40b6-bfb7-12a157eb4b95","Type":"ContainerDied","Data":"21d543c6e21f9f761c2afa3bc899bece8caad13427e2538af83279dce190781e"} Feb 17 00:27:37 crc kubenswrapper[4791]: I0217 00:27:37.074260 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21d543c6e21f9f761c2afa3bc899bece8caad13427e2538af83279dce190781e" Feb 17 00:27:37 crc kubenswrapper[4791]: I0217 00:27:37.074387 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 17 00:27:37 crc kubenswrapper[4791]: I0217 00:27:37.397676 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "df17f317-5914-40b6-bfb7-12a157eb4b95" (UID: "df17f317-5914-40b6-bfb7-12a157eb4b95"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:27:37 crc kubenswrapper[4791]: I0217 00:27:37.418144 4791 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/df17f317-5914-40b6-bfb7-12a157eb4b95-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 17 00:27:41 crc kubenswrapper[4791]: I0217 00:27:41.983753 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-6f787cb998-k5dw6"] Feb 17 00:27:41 crc kubenswrapper[4791]: E0217 00:27:41.984576 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerName="docker-build" Feb 17 00:27:41 crc kubenswrapper[4791]: I0217 00:27:41.984590 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerName="docker-build" Feb 17 00:27:41 crc kubenswrapper[4791]: E0217 00:27:41.984614 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerName="manage-dockerfile" Feb 17 00:27:41 crc kubenswrapper[4791]: I0217 00:27:41.984620 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerName="manage-dockerfile" Feb 17 00:27:41 crc kubenswrapper[4791]: E0217 00:27:41.984628 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerName="git-clone" Feb 17 00:27:41 crc kubenswrapper[4791]: I0217 00:27:41.984634 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerName="git-clone" Feb 17 00:27:41 crc kubenswrapper[4791]: I0217 00:27:41.984735 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="df17f317-5914-40b6-bfb7-12a157eb4b95" containerName="docker-build" Feb 17 00:27:41 crc kubenswrapper[4791]: I0217 00:27:41.985222 4791 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:41 crc kubenswrapper[4791]: I0217 00:27:41.987173 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-p8dw4" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.000251 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-6f787cb998-k5dw6"] Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.074526 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/29c809ce-6a9b-4496-9c8e-8cd4506d926b-runner\") pod \"smart-gateway-operator-6f787cb998-k5dw6\" (UID: \"29c809ce-6a9b-4496-9c8e-8cd4506d926b\") " pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.074603 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmllp\" (UniqueName: \"kubernetes.io/projected/29c809ce-6a9b-4496-9c8e-8cd4506d926b-kube-api-access-hmllp\") pod \"smart-gateway-operator-6f787cb998-k5dw6\" (UID: \"29c809ce-6a9b-4496-9c8e-8cd4506d926b\") " pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.175331 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/29c809ce-6a9b-4496-9c8e-8cd4506d926b-runner\") pod \"smart-gateway-operator-6f787cb998-k5dw6\" (UID: \"29c809ce-6a9b-4496-9c8e-8cd4506d926b\") " pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.175396 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmllp\" (UniqueName: 
\"kubernetes.io/projected/29c809ce-6a9b-4496-9c8e-8cd4506d926b-kube-api-access-hmllp\") pod \"smart-gateway-operator-6f787cb998-k5dw6\" (UID: \"29c809ce-6a9b-4496-9c8e-8cd4506d926b\") " pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.175906 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/29c809ce-6a9b-4496-9c8e-8cd4506d926b-runner\") pod \"smart-gateway-operator-6f787cb998-k5dw6\" (UID: \"29c809ce-6a9b-4496-9c8e-8cd4506d926b\") " pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.196896 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmllp\" (UniqueName: \"kubernetes.io/projected/29c809ce-6a9b-4496-9c8e-8cd4506d926b-kube-api-access-hmllp\") pod \"smart-gateway-operator-6f787cb998-k5dw6\" (UID: \"29c809ce-6a9b-4496-9c8e-8cd4506d926b\") " pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.321412 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.512338 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-6f787cb998-k5dw6"] Feb 17 00:27:42 crc kubenswrapper[4791]: W0217 00:27:42.514577 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29c809ce_6a9b_4496_9c8e_8cd4506d926b.slice/crio-168a2747279fbb64ed777ab6ddd8cd204508eb72f7c55148d7923e2bf2c5b708 WatchSource:0}: Error finding container 168a2747279fbb64ed777ab6ddd8cd204508eb72f7c55148d7923e2bf2c5b708: Status 404 returned error can't find the container with id 168a2747279fbb64ed777ab6ddd8cd204508eb72f7c55148d7923e2bf2c5b708 Feb 17 00:27:42 crc kubenswrapper[4791]: I0217 00:27:42.517203 4791 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 00:27:43 crc kubenswrapper[4791]: I0217 00:27:43.115298 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" event={"ID":"29c809ce-6a9b-4496-9c8e-8cd4506d926b","Type":"ContainerStarted","Data":"168a2747279fbb64ed777ab6ddd8cd204508eb72f7c55148d7923e2bf2c5b708"} Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.257265 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7"] Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.258689 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.262266 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-hwbvw" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.275884 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7"] Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.358721 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/c8e979be-fe5a-4d89-b1a6-0260fffdd27c-runner\") pod \"service-telemetry-operator-6f88b4fbc-h6xn7\" (UID: \"c8e979be-fe5a-4d89-b1a6-0260fffdd27c\") " pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.358814 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvpgt\" (UniqueName: \"kubernetes.io/projected/c8e979be-fe5a-4d89-b1a6-0260fffdd27c-kube-api-access-dvpgt\") pod \"service-telemetry-operator-6f88b4fbc-h6xn7\" (UID: \"c8e979be-fe5a-4d89-b1a6-0260fffdd27c\") " pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.459576 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/c8e979be-fe5a-4d89-b1a6-0260fffdd27c-runner\") pod \"service-telemetry-operator-6f88b4fbc-h6xn7\" (UID: \"c8e979be-fe5a-4d89-b1a6-0260fffdd27c\") " pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.459848 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvpgt\" (UniqueName: 
\"kubernetes.io/projected/c8e979be-fe5a-4d89-b1a6-0260fffdd27c-kube-api-access-dvpgt\") pod \"service-telemetry-operator-6f88b4fbc-h6xn7\" (UID: \"c8e979be-fe5a-4d89-b1a6-0260fffdd27c\") " pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.460054 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/c8e979be-fe5a-4d89-b1a6-0260fffdd27c-runner\") pod \"service-telemetry-operator-6f88b4fbc-h6xn7\" (UID: \"c8e979be-fe5a-4d89-b1a6-0260fffdd27c\") " pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.479536 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvpgt\" (UniqueName: \"kubernetes.io/projected/c8e979be-fe5a-4d89-b1a6-0260fffdd27c-kube-api-access-dvpgt\") pod \"service-telemetry-operator-6f88b4fbc-h6xn7\" (UID: \"c8e979be-fe5a-4d89-b1a6-0260fffdd27c\") " pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:48 crc kubenswrapper[4791]: I0217 00:27:48.578871 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" Feb 17 00:27:52 crc kubenswrapper[4791]: I0217 00:27:52.912496 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7"] Feb 17 00:27:57 crc kubenswrapper[4791]: W0217 00:27:57.487042 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8e979be_fe5a_4d89_b1a6_0260fffdd27c.slice/crio-aeed2189cc9c95e95edc14791d1d728e87eee6d7fddce45f2a69ab31f3ad974c WatchSource:0}: Error finding container aeed2189cc9c95e95edc14791d1d728e87eee6d7fddce45f2a69ab31f3ad974c: Status 404 returned error can't find the container with id aeed2189cc9c95e95edc14791d1d728e87eee6d7fddce45f2a69ab31f3ad974c Feb 17 00:27:58 crc kubenswrapper[4791]: I0217 00:27:58.221501 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" event={"ID":"c8e979be-fe5a-4d89-b1a6-0260fffdd27c","Type":"ContainerStarted","Data":"aeed2189cc9c95e95edc14791d1d728e87eee6d7fddce45f2a69ab31f3ad974c"} Feb 17 00:27:59 crc kubenswrapper[4791]: E0217 00:27:59.065614 4791 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:latest" Feb 17 00:27:59 crc kubenswrapper[4791]: E0217 00:27:59.065839 4791 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1771288058,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hmllp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop
:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-6f787cb998-k5dw6_service-telemetry(29c809ce-6a9b-4496-9c8e-8cd4506d926b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 00:27:59 crc kubenswrapper[4791]: E0217 00:27:59.067207 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" podUID="29c809ce-6a9b-4496-9c8e-8cd4506d926b" Feb 17 00:27:59 crc kubenswrapper[4791]: E0217 00:27:59.229556 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:latest\\\"\"" pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" podUID="29c809ce-6a9b-4496-9c8e-8cd4506d926b" Feb 17 00:28:04 crc kubenswrapper[4791]: I0217 00:28:04.264928 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" event={"ID":"c8e979be-fe5a-4d89-b1a6-0260fffdd27c","Type":"ContainerStarted","Data":"dcc4b3416ec3b357c3c1d368d158eb7719bdb7e72028515c225f4b5839c54a14"} Feb 17 00:28:04 crc kubenswrapper[4791]: I0217 00:28:04.302943 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-6f88b4fbc-h6xn7" podStartSLOduration=10.209074012 
podStartE2EDuration="16.302914872s" podCreationTimestamp="2026-02-17 00:27:48 +0000 UTC" firstStartedPulling="2026-02-17 00:27:57.492655625 +0000 UTC m=+1334.972168152" lastFinishedPulling="2026-02-17 00:28:03.586496445 +0000 UTC m=+1341.066009012" observedRunningTime="2026-02-17 00:28:04.28962646 +0000 UTC m=+1341.769139027" watchObservedRunningTime="2026-02-17 00:28:04.302914872 +0000 UTC m=+1341.782427439" Feb 17 00:28:13 crc kubenswrapper[4791]: I0217 00:28:13.334708 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" event={"ID":"29c809ce-6a9b-4496-9c8e-8cd4506d926b","Type":"ContainerStarted","Data":"15fc434018528a24e3e0c2d2ad32c5b44b2835eea1569d6d4d295be2c2eb389b"} Feb 17 00:28:13 crc kubenswrapper[4791]: I0217 00:28:13.354476 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-6f787cb998-k5dw6" podStartSLOduration=1.977734094 podStartE2EDuration="32.354457311s" podCreationTimestamp="2026-02-17 00:27:41 +0000 UTC" firstStartedPulling="2026-02-17 00:27:42.516913923 +0000 UTC m=+1319.996426450" lastFinishedPulling="2026-02-17 00:28:12.8936371 +0000 UTC m=+1350.373149667" observedRunningTime="2026-02-17 00:28:13.352537922 +0000 UTC m=+1350.832050489" watchObservedRunningTime="2026-02-17 00:28:13.354457311 +0000 UTC m=+1350.833969858" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.455524 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-7q5s9"] Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.456927 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.459606 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.459790 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.460047 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.460743 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.460872 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.461641 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-c5cq6" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.461912 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.484908 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-7q5s9"] Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.518760 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-users\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " 
pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.518837 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.518923 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt7d8\" (UniqueName: \"kubernetes.io/projected/99be99ec-fe95-419c-ba3f-b4e3601e433a-kube-api-access-qt7d8\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.518966 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.519024 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: 
I0217 00:28:26.519067 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.519127 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-config\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.620660 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-config\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.620725 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-users\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.620761 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-ca\") pod 
\"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.620800 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt7d8\" (UniqueName: \"kubernetes.io/projected/99be99ec-fe95-419c-ba3f-b4e3601e433a-kube-api-access-qt7d8\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.620833 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.620871 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.620902 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 
crc kubenswrapper[4791]: I0217 00:28:26.621657 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-config\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.627481 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.627579 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-users\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.627662 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.628305 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-ca\") pod 
\"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.641508 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.649238 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt7d8\" (UniqueName: \"kubernetes.io/projected/99be99ec-fe95-419c-ba3f-b4e3601e433a-kube-api-access-qt7d8\") pod \"default-interconnect-68864d46cb-7q5s9\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:26 crc kubenswrapper[4791]: I0217 00:28:26.791358 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:28:27 crc kubenswrapper[4791]: I0217 00:28:27.219554 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-7q5s9"] Feb 17 00:28:27 crc kubenswrapper[4791]: I0217 00:28:27.422783 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" event={"ID":"99be99ec-fe95-419c-ba3f-b4e3601e433a","Type":"ContainerStarted","Data":"b23f71e1251fc3bfc8c2584134054c100e44edda849e4b8dcd9907f81f1b91b4"} Feb 17 00:28:32 crc kubenswrapper[4791]: I0217 00:28:32.460000 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" event={"ID":"99be99ec-fe95-419c-ba3f-b4e3601e433a","Type":"ContainerStarted","Data":"5c0b32a599acff163bd3b75453552de8cfb3a2023cfaf54423af8eea541e17ba"} Feb 17 00:28:32 crc kubenswrapper[4791]: I0217 00:28:32.483937 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" podStartSLOduration=1.87580377 podStartE2EDuration="6.483917338s" podCreationTimestamp="2026-02-17 00:28:26 +0000 UTC" firstStartedPulling="2026-02-17 00:28:27.22756315 +0000 UTC m=+1364.707075677" lastFinishedPulling="2026-02-17 00:28:31.835676718 +0000 UTC m=+1369.315189245" observedRunningTime="2026-02-17 00:28:32.480205262 +0000 UTC m=+1369.959717799" watchObservedRunningTime="2026-02-17 00:28:32.483917338 +0000 UTC m=+1369.963429865" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.613376 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.615423 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.619594 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.619680 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.619889 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.619950 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.620035 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.620160 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.619614 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-7fpb8" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.620299 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.620500 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.620736 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.636306 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/prometheus-default-0"] Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.667993 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668280 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-tls-assets\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668373 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-config\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668466 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668589 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-1\") pod 
\"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668689 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvrbh\" (UniqueName: \"kubernetes.io/projected/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-kube-api-access-mvrbh\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668798 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b356babb-a624-4180-bc64-21d7c7e19a71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b356babb-a624-4180-bc64-21d7c7e19a71\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668924 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-config-out\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.668998 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.669232 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-web-config\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.669395 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.669537 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.771356 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-config\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.771449 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.771591 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.771652 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvrbh\" (UniqueName: \"kubernetes.io/projected/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-kube-api-access-mvrbh\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.771735 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b356babb-a624-4180-bc64-21d7c7e19a71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b356babb-a624-4180-bc64-21d7c7e19a71\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.772496 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.772525 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.772155 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.773222 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-config-out\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.773286 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-web-config\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.773363 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.773423 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.773473 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.773504 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-tls-assets\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: E0217 00:28:36.773653 4791 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Feb 17 00:28:36 crc kubenswrapper[4791]: E0217 00:28:36.773714 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls podName:af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9 nodeName:}" failed. No retries permitted until 2026-02-17 00:28:37.273698915 +0000 UTC m=+1374.753211532 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9") : secret "default-prometheus-proxy-tls" not found Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.774240 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.774473 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.779437 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-config\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.780484 4791 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.780523 4791 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b356babb-a624-4180-bc64-21d7c7e19a71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b356babb-a624-4180-bc64-21d7c7e19a71\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/15b97e0c19fd885ae252fc1669bf27c70a61564e4d934a06044237c0a873e999/globalmount\"" pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.788591 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-web-config\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.791146 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-tls-assets\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.791270 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-config-out\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.792345 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: 
\"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.796852 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvrbh\" (UniqueName: \"kubernetes.io/projected/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-kube-api-access-mvrbh\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:36 crc kubenswrapper[4791]: I0217 00:28:36.802421 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b356babb-a624-4180-bc64-21d7c7e19a71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b356babb-a624-4180-bc64-21d7c7e19a71\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:37 crc kubenswrapper[4791]: I0217 00:28:37.279529 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:37 crc kubenswrapper[4791]: E0217 00:28:37.279750 4791 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Feb 17 00:28:37 crc kubenswrapper[4791]: E0217 00:28:37.279810 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls podName:af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9 nodeName:}" failed. No retries permitted until 2026-02-17 00:28:38.279789598 +0000 UTC m=+1375.759302125 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9") : secret "default-prometheus-proxy-tls" not found Feb 17 00:28:38 crc kubenswrapper[4791]: I0217 00:28:38.294503 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:38 crc kubenswrapper[4791]: I0217 00:28:38.302179 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9\") " pod="service-telemetry/prometheus-default-0" Feb 17 00:28:38 crc kubenswrapper[4791]: I0217 00:28:38.439240 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Feb 17 00:28:38 crc kubenswrapper[4791]: I0217 00:28:38.898066 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Feb 17 00:28:39 crc kubenswrapper[4791]: I0217 00:28:39.511039 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9","Type":"ContainerStarted","Data":"3a7d02d6cf2e9460b02c43ab7b535613692b3989086ccd32bff5782a33504cd8"} Feb 17 00:28:44 crc kubenswrapper[4791]: I0217 00:28:44.544362 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9","Type":"ContainerStarted","Data":"800a003e4a34de46b54655e05480a4349c1d5a3d6b4843b985ca73d32a00cac5"} Feb 17 00:28:46 crc kubenswrapper[4791]: I0217 00:28:46.517411 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-z8m2z"] Feb 17 00:28:46 crc kubenswrapper[4791]: I0217 00:28:46.518643 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" Feb 17 00:28:46 crc kubenswrapper[4791]: I0217 00:28:46.542960 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-z8m2z"] Feb 17 00:28:46 crc kubenswrapper[4791]: I0217 00:28:46.612403 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj5wt\" (UniqueName: \"kubernetes.io/projected/135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd-kube-api-access-cj5wt\") pod \"default-snmp-webhook-6856cfb745-z8m2z\" (UID: \"135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" Feb 17 00:28:46 crc kubenswrapper[4791]: I0217 00:28:46.713779 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj5wt\" (UniqueName: \"kubernetes.io/projected/135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd-kube-api-access-cj5wt\") pod \"default-snmp-webhook-6856cfb745-z8m2z\" (UID: \"135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" Feb 17 00:28:46 crc kubenswrapper[4791]: I0217 00:28:46.738700 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj5wt\" (UniqueName: \"kubernetes.io/projected/135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd-kube-api-access-cj5wt\") pod \"default-snmp-webhook-6856cfb745-z8m2z\" (UID: \"135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" Feb 17 00:28:46 crc kubenswrapper[4791]: I0217 00:28:46.851904 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" Feb 17 00:28:47 crc kubenswrapper[4791]: I0217 00:28:47.085963 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-z8m2z"] Feb 17 00:28:47 crc kubenswrapper[4791]: W0217 00:28:47.087720 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod135a7c9c_cdfa_4baa_a4b3_ea9f6392d1cd.slice/crio-3f8026074708c1af4cdec352e7a4e4d7742820b010c878e8c08816f06add69fa WatchSource:0}: Error finding container 3f8026074708c1af4cdec352e7a4e4d7742820b010c878e8c08816f06add69fa: Status 404 returned error can't find the container with id 3f8026074708c1af4cdec352e7a4e4d7742820b010c878e8c08816f06add69fa Feb 17 00:28:47 crc kubenswrapper[4791]: I0217 00:28:47.567677 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" event={"ID":"135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd","Type":"ContainerStarted","Data":"3f8026074708c1af4cdec352e7a4e4d7742820b010c878e8c08816f06add69fa"} Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.296372 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.298452 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.306059 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.306437 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.306629 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.306792 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-9svbl" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.310449 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.310627 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.311667 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.379850 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-config-volume\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.379892 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/df247d19-621c-4c9b-a436-d4f263dcb5ae-config-out\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.379930 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-web-config\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.379989 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.380010 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.380033 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.380061 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.380144 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/df247d19-621c-4c9b-a436-d4f263dcb5ae-tls-assets\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.380175 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dlzf\" (UniqueName: \"kubernetes.io/projected/df247d19-621c-4c9b-a436-d4f263dcb5ae-kube-api-access-9dlzf\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.480841 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/df247d19-621c-4c9b-a436-d4f263dcb5ae-tls-assets\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.481061 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dlzf\" (UniqueName: \"kubernetes.io/projected/df247d19-621c-4c9b-a436-d4f263dcb5ae-kube-api-access-9dlzf\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 
00:28:50.481199 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-config-volume\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.481275 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/df247d19-621c-4c9b-a436-d4f263dcb5ae-config-out\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.481364 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-web-config\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.481443 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.481517 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.481596 4791 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.481664 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: E0217 00:28:50.481840 4791 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Feb 17 00:28:50 crc kubenswrapper[4791]: E0217 00:28:50.481938 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls podName:df247d19-621c-4c9b-a436-d4f263dcb5ae nodeName:}" failed. No retries permitted until 2026-02-17 00:28:50.981921939 +0000 UTC m=+1388.461434466 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "df247d19-621c-4c9b-a436-d4f263dcb5ae") : secret "default-alertmanager-proxy-tls" not found Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.494723 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.494964 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/df247d19-621c-4c9b-a436-d4f263dcb5ae-config-out\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.495435 4791 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.495486 4791 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/51638830e311897aa0d4241a4b7178f92cff69c8608a060b8511b181cc9935b1/globalmount\"" pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.500713 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.501417 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-config-volume\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.502789 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/df247d19-621c-4c9b-a436-d4f263dcb5ae-tls-assets\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.507092 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-web-config\") pod 
\"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.521244 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dlzf\" (UniqueName: \"kubernetes.io/projected/df247d19-621c-4c9b-a436-d4f263dcb5ae-kube-api-access-9dlzf\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.577941 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e329ac4-40a0-4d17-b294-c3e266564e22\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: I0217 00:28:50.990350 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:50 crc kubenswrapper[4791]: E0217 00:28:50.990611 4791 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Feb 17 00:28:50 crc kubenswrapper[4791]: E0217 00:28:50.991280 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls podName:df247d19-621c-4c9b-a436-d4f263dcb5ae nodeName:}" failed. No retries permitted until 2026-02-17 00:28:51.991246273 +0000 UTC m=+1389.470758830 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "df247d19-621c-4c9b-a436-d4f263dcb5ae") : secret "default-alertmanager-proxy-tls" not found Feb 17 00:28:51 crc kubenswrapper[4791]: I0217 00:28:51.596947 4791 generic.go:334] "Generic (PLEG): container finished" podID="af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9" containerID="800a003e4a34de46b54655e05480a4349c1d5a3d6b4843b985ca73d32a00cac5" exitCode=0 Feb 17 00:28:51 crc kubenswrapper[4791]: I0217 00:28:51.596987 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9","Type":"ContainerDied","Data":"800a003e4a34de46b54655e05480a4349c1d5a3d6b4843b985ca73d32a00cac5"} Feb 17 00:28:52 crc kubenswrapper[4791]: I0217 00:28:52.010555 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:52 crc kubenswrapper[4791]: E0217 00:28:52.010827 4791 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Feb 17 00:28:52 crc kubenswrapper[4791]: E0217 00:28:52.010908 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls podName:df247d19-621c-4c9b-a436-d4f263dcb5ae nodeName:}" failed. No retries permitted until 2026-02-17 00:28:54.010886712 +0000 UTC m=+1391.490399239 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "df247d19-621c-4c9b-a436-d4f263dcb5ae") : secret "default-alertmanager-proxy-tls" not found Feb 17 00:28:54 crc kubenswrapper[4791]: I0217 00:28:54.034808 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:54 crc kubenswrapper[4791]: I0217 00:28:54.040968 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/df247d19-621c-4c9b-a436-d4f263dcb5ae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"df247d19-621c-4c9b-a436-d4f263dcb5ae\") " pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:54 crc kubenswrapper[4791]: I0217 00:28:54.223217 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Feb 17 00:28:55 crc kubenswrapper[4791]: I0217 00:28:55.979894 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Feb 17 00:28:56 crc kubenswrapper[4791]: I0217 00:28:56.643602 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"df247d19-621c-4c9b-a436-d4f263dcb5ae","Type":"ContainerStarted","Data":"9a8658cf5d59a0c49e5ceb447e7880e0e70a7fd978c3817b4a367f2f94def0e4"} Feb 17 00:28:56 crc kubenswrapper[4791]: I0217 00:28:56.645064 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" event={"ID":"135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd","Type":"ContainerStarted","Data":"ba9c3932a458788ed268f59e278169101bc6fe2285f0cd5930490be0912c7ef9"} Feb 17 00:28:56 crc kubenswrapper[4791]: I0217 00:28:56.668624 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-z8m2z" podStartSLOduration=2.135212671 podStartE2EDuration="10.66860632s" podCreationTimestamp="2026-02-17 00:28:46 +0000 UTC" firstStartedPulling="2026-02-17 00:28:47.09007319 +0000 UTC m=+1384.569585727" lastFinishedPulling="2026-02-17 00:28:55.623466849 +0000 UTC m=+1393.102979376" observedRunningTime="2026-02-17 00:28:56.665496483 +0000 UTC m=+1394.145009020" watchObservedRunningTime="2026-02-17 00:28:56.66860632 +0000 UTC m=+1394.148118847" Feb 17 00:28:58 crc kubenswrapper[4791]: I0217 00:28:58.658531 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"df247d19-621c-4c9b-a436-d4f263dcb5ae","Type":"ContainerStarted","Data":"998ef5a14e6e2a31346916c64f98b39fb611f2c22ab183eb500881c5f701baa9"} Feb 17 00:29:00 crc kubenswrapper[4791]: I0217 00:29:00.677582 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/prometheus-default-0" event={"ID":"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9","Type":"ContainerStarted","Data":"3753dc97d2d9a9f9c377ac877c2152f4b3ae9f33fdfea70a5940be95ade659c6"} Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.599636 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8"] Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.602572 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.607701 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.607718 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.607767 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.616405 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-dwsz6" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.626765 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8"] Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.690954 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/0d7ac416-17e0-4f86-8786-0afdec7fc240-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.691073 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/0d7ac416-17e0-4f86-8786-0afdec7fc240-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.691152 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.691388 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ksqx\" (UniqueName: \"kubernetes.io/projected/0d7ac416-17e0-4f86-8786-0afdec7fc240-kube-api-access-6ksqx\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.691424 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.715851 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9","Type":"ContainerStarted","Data":"e4daa4b3e8564fa8acd4bcf94b8ae100cc65c2fd1c2aaded03771f6a80791e1b"} Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.724242 4791 generic.go:334] "Generic (PLEG): container finished" podID="df247d19-621c-4c9b-a436-d4f263dcb5ae" containerID="998ef5a14e6e2a31346916c64f98b39fb611f2c22ab183eb500881c5f701baa9" exitCode=0 Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.724308 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"df247d19-621c-4c9b-a436-d4f263dcb5ae","Type":"ContainerDied","Data":"998ef5a14e6e2a31346916c64f98b39fb611f2c22ab183eb500881c5f701baa9"} Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.793300 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ksqx\" (UniqueName: \"kubernetes.io/projected/0d7ac416-17e0-4f86-8786-0afdec7fc240-kube-api-access-6ksqx\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.793370 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.793433 4791 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/0d7ac416-17e0-4f86-8786-0afdec7fc240-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.793460 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/0d7ac416-17e0-4f86-8786-0afdec7fc240-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.793531 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: E0217 00:29:03.794784 4791 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Feb 17 00:29:03 crc kubenswrapper[4791]: E0217 00:29:03.794851 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls podName:0d7ac416-17e0-4f86-8786-0afdec7fc240 nodeName:}" failed. No retries permitted until 2026-02-17 00:29:04.294828872 +0000 UTC m=+1401.774341399 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" (UID: "0d7ac416-17e0-4f86-8786-0afdec7fc240") : secret "default-cloud1-coll-meter-proxy-tls" not found Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.795735 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/0d7ac416-17e0-4f86-8786-0afdec7fc240-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.796411 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/0d7ac416-17e0-4f86-8786-0afdec7fc240-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.820452 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:03 crc kubenswrapper[4791]: I0217 00:29:03.821672 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ksqx\" (UniqueName: \"kubernetes.io/projected/0d7ac416-17e0-4f86-8786-0afdec7fc240-kube-api-access-6ksqx\") pod 
\"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:04 crc kubenswrapper[4791]: I0217 00:29:04.300078 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:04 crc kubenswrapper[4791]: E0217 00:29:04.300254 4791 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Feb 17 00:29:04 crc kubenswrapper[4791]: E0217 00:29:04.300346 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls podName:0d7ac416-17e0-4f86-8786-0afdec7fc240 nodeName:}" failed. No retries permitted until 2026-02-17 00:29:05.300324157 +0000 UTC m=+1402.779836684 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" (UID: "0d7ac416-17e0-4f86-8786-0afdec7fc240") : secret "default-cloud1-coll-meter-proxy-tls" not found Feb 17 00:29:05 crc kubenswrapper[4791]: I0217 00:29:05.316063 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:05 crc kubenswrapper[4791]: I0217 00:29:05.329233 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d7ac416-17e0-4f86-8786-0afdec7fc240-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8\" (UID: \"0d7ac416-17e0-4f86-8786-0afdec7fc240\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:05 crc kubenswrapper[4791]: I0217 00:29:05.423279 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.232925 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9"] Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.234231 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.236956 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.236954 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.245209 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9"] Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.430648 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.430764 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.430811 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-socket-dir\") pod 
\"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.430843 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-865kq\" (UniqueName: \"kubernetes.io/projected/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-kube-api-access-865kq\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.430953 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.532711 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.532773 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") 
" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.532797 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.532819 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-865kq\" (UniqueName: \"kubernetes.io/projected/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-kube-api-access-865kq\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.532850 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: E0217 00:29:06.532886 4791 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Feb 17 00:29:06 crc kubenswrapper[4791]: E0217 00:29:06.532957 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls podName:e3d9a725-2f9c-4fcc-8610-4b297a3d689d nodeName:}" failed. 
No retries permitted until 2026-02-17 00:29:07.032940021 +0000 UTC m=+1404.512452538 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" (UID: "e3d9a725-2f9c-4fcc-8610-4b297a3d689d") : secret "default-cloud1-ceil-meter-proxy-tls" not found Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.533902 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.535057 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.537248 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:06 crc kubenswrapper[4791]: I0217 00:29:06.561571 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-865kq\" (UniqueName: 
\"kubernetes.io/projected/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-kube-api-access-865kq\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:07 crc kubenswrapper[4791]: I0217 00:29:07.039182 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:07 crc kubenswrapper[4791]: E0217 00:29:07.039289 4791 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Feb 17 00:29:07 crc kubenswrapper[4791]: E0217 00:29:07.039392 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls podName:e3d9a725-2f9c-4fcc-8610-4b297a3d689d nodeName:}" failed. No retries permitted until 2026-02-17 00:29:08.039373945 +0000 UTC m=+1405.518886472 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" (UID: "e3d9a725-2f9c-4fcc-8610-4b297a3d689d") : secret "default-cloud1-ceil-meter-proxy-tls" not found Feb 17 00:29:08 crc kubenswrapper[4791]: I0217 00:29:08.050679 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:08 crc kubenswrapper[4791]: I0217 00:29:08.059023 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3d9a725-2f9c-4fcc-8610-4b297a3d689d-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9\" (UID: \"e3d9a725-2f9c-4fcc-8610-4b297a3d689d\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:08 crc kubenswrapper[4791]: I0217 00:29:08.356161 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" Feb 17 00:29:09 crc kubenswrapper[4791]: I0217 00:29:09.443348 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8"] Feb 17 00:29:09 crc kubenswrapper[4791]: I0217 00:29:09.946289 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p"] Feb 17 00:29:09 crc kubenswrapper[4791]: I0217 00:29:09.949223 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:09 crc kubenswrapper[4791]: I0217 00:29:09.951323 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Feb 17 00:29:09 crc kubenswrapper[4791]: I0217 00:29:09.951853 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Feb 17 00:29:09 crc kubenswrapper[4791]: I0217 00:29:09.965271 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p"] Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.084577 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.084631 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: 
\"kubernetes.io/configmap/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.084681 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.084708 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.084730 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh2wc\" (UniqueName: \"kubernetes.io/projected/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-kube-api-access-nh2wc\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.186309 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-socket-dir\") pod 
\"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.186360 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.186395 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh2wc\" (UniqueName: \"kubernetes.io/projected/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-kube-api-access-nh2wc\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.186477 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.186506 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " 
pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: E0217 00:29:10.186562 4791 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Feb 17 00:29:10 crc kubenswrapper[4791]: E0217 00:29:10.186932 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls podName:6e985bca-5f71-47e0-bd63-ede2ad79bd7e nodeName:}" failed. No retries permitted until 2026-02-17 00:29:10.686912044 +0000 UTC m=+1408.166424561 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" (UID: "6e985bca-5f71-47e0-bd63-ede2ad79bd7e") : secret "default-cloud1-sens-meter-proxy-tls" not found Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.187896 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.188249 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.193876 4791 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.222872 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh2wc\" (UniqueName: \"kubernetes.io/projected/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-kube-api-access-nh2wc\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: I0217 00:29:10.694366 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:10 crc kubenswrapper[4791]: E0217 00:29:10.694577 4791 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Feb 17 00:29:10 crc kubenswrapper[4791]: E0217 00:29:10.694878 4791 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls podName:6e985bca-5f71-47e0-bd63-ede2ad79bd7e nodeName:}" failed. No retries permitted until 2026-02-17 00:29:11.694859906 +0000 UTC m=+1409.174372433 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" (UID: "6e985bca-5f71-47e0-bd63-ede2ad79bd7e") : secret "default-cloud1-sens-meter-proxy-tls" not found Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.279732 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9"] Feb 17 00:29:11 crc kubenswrapper[4791]: W0217 00:29:11.293292 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3d9a725_2f9c_4fcc_8610_4b297a3d689d.slice/crio-607da8697ab8119c957c44b118ef4d1b97a48836c0ae10a133011a6fb7b565d5 WatchSource:0}: Error finding container 607da8697ab8119c957c44b118ef4d1b97a48836c0ae10a133011a6fb7b565d5: Status 404 returned error can't find the container with id 607da8697ab8119c957c44b118ef4d1b97a48836c0ae10a133011a6fb7b565d5 Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.714168 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.724828 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e985bca-5f71-47e0-bd63-ede2ad79bd7e-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p\" (UID: \"6e985bca-5f71-47e0-bd63-ede2ad79bd7e\") " 
pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.765672 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.786225 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9","Type":"ContainerStarted","Data":"37c810ec5bee67125f7768518146935329d76039df0ea1422f1a62c24a4c0161"} Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.787226 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerStarted","Data":"2ea0b480a8247883d24e5aded33de0fdcbf8948ea1f81f572b1d93ba31c2ef4b"} Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.789505 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"df247d19-621c-4c9b-a436-d4f263dcb5ae","Type":"ContainerStarted","Data":"84e468c8b653dc5b490fb7c9a38b4cd72c4b7f05c8279923819b2ccb0540039c"} Feb 17 00:29:11 crc kubenswrapper[4791]: I0217 00:29:11.790956 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerStarted","Data":"607da8697ab8119c957c44b118ef4d1b97a48836c0ae10a133011a6fb7b565d5"} Feb 17 00:29:12 crc kubenswrapper[4791]: I0217 00:29:12.242196 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=5.127527702 podStartE2EDuration="37.242176088s" podCreationTimestamp="2026-02-17 00:28:35 +0000 UTC" firstStartedPulling="2026-02-17 00:28:38.899955134 +0000 UTC 
m=+1376.379467661" lastFinishedPulling="2026-02-17 00:29:11.01460352 +0000 UTC m=+1408.494116047" observedRunningTime="2026-02-17 00:29:11.824917255 +0000 UTC m=+1409.304429782" watchObservedRunningTime="2026-02-17 00:29:12.242176088 +0000 UTC m=+1409.721688635" Feb 17 00:29:12 crc kubenswrapper[4791]: I0217 00:29:12.245867 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p"] Feb 17 00:29:12 crc kubenswrapper[4791]: W0217 00:29:12.277386 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e985bca_5f71_47e0_bd63_ede2ad79bd7e.slice/crio-faf1194849a6b08f2342300cdeb5f6df3eb6d76cc07d1b36f2d2fcff087e07be WatchSource:0}: Error finding container faf1194849a6b08f2342300cdeb5f6df3eb6d76cc07d1b36f2d2fcff087e07be: Status 404 returned error can't find the container with id faf1194849a6b08f2342300cdeb5f6df3eb6d76cc07d1b36f2d2fcff087e07be Feb 17 00:29:12 crc kubenswrapper[4791]: I0217 00:29:12.797976 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerStarted","Data":"faf1194849a6b08f2342300cdeb5f6df3eb6d76cc07d1b36f2d2fcff087e07be"} Feb 17 00:29:12 crc kubenswrapper[4791]: I0217 00:29:12.800123 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerStarted","Data":"0bcd8fd838c425b802a4819261a744dba4ce449654ec0a0fe6ae473c08cdeba1"} Feb 17 00:29:13 crc kubenswrapper[4791]: I0217 00:29:13.439991 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Feb 17 00:29:13 crc kubenswrapper[4791]: I0217 00:29:13.807596 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerStarted","Data":"3d3e6e5f6fae0a1d9bc95cd8d0ec2bdd51919a431ec36da1b9ad69643dd5bb68"} Feb 17 00:29:13 crc kubenswrapper[4791]: I0217 00:29:13.810258 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"df247d19-621c-4c9b-a436-d4f263dcb5ae","Type":"ContainerStarted","Data":"153265f174b4a9d401be05ef4b3d5074ecb8cffeda6cb6fb5a3745907cbc94f9"} Feb 17 00:29:16 crc kubenswrapper[4791]: I0217 00:29:16.833342 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerStarted","Data":"06984239a9a3196958fafabce1efbb7339596d15d505f88825db83923ee3bd54"} Feb 17 00:29:16 crc kubenswrapper[4791]: I0217 00:29:16.836049 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"df247d19-621c-4c9b-a436-d4f263dcb5ae","Type":"ContainerStarted","Data":"b1e6bba6a1270fae1a1dedb13487801a2467c60de6bdb21dae1122ce5b716e1d"} Feb 17 00:29:16 crc kubenswrapper[4791]: I0217 00:29:16.859349 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=15.407657542 podStartE2EDuration="27.859333717s" podCreationTimestamp="2026-02-17 00:28:49 +0000 UTC" firstStartedPulling="2026-02-17 00:29:03.730302547 +0000 UTC m=+1401.209815074" lastFinishedPulling="2026-02-17 00:29:16.181978722 +0000 UTC m=+1413.661491249" observedRunningTime="2026-02-17 00:29:16.857095787 +0000 UTC m=+1414.336608314" watchObservedRunningTime="2026-02-17 00:29:16.859333717 +0000 UTC m=+1414.338846244" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.564608 4791 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk"] Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.565690 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.569260 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.569392 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.584164 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk"] Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.701042 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/4634db00-8e3c-4569-b66c-ef549eda9204-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.701105 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rs6n\" (UniqueName: \"kubernetes.io/projected/4634db00-8e3c-4569-b66c-ef549eda9204-kube-api-access-6rs6n\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.701462 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"elastic-certs\" (UniqueName: \"kubernetes.io/secret/4634db00-8e3c-4569-b66c-ef549eda9204-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.701632 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/4634db00-8e3c-4569-b66c-ef549eda9204-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.803506 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/4634db00-8e3c-4569-b66c-ef549eda9204-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.803903 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/4634db00-8e3c-4569-b66c-ef549eda9204-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.803938 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rs6n\" (UniqueName: \"kubernetes.io/projected/4634db00-8e3c-4569-b66c-ef549eda9204-kube-api-access-6rs6n\") pod 
\"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.804031 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/4634db00-8e3c-4569-b66c-ef549eda9204-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.804600 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/4634db00-8e3c-4569-b66c-ef549eda9204-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.804684 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/4634db00-8e3c-4569-b66c-ef549eda9204-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.826046 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/4634db00-8e3c-4569-b66c-ef549eda9204-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 
00:29:17.826773 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rs6n\" (UniqueName: \"kubernetes.io/projected/4634db00-8e3c-4569-b66c-ef549eda9204-kube-api-access-6rs6n\") pod \"default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk\" (UID: \"4634db00-8e3c-4569-b66c-ef549eda9204\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.844054 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerStarted","Data":"74d069a0a7b95b2c13b5d7774c37eff82d25f60a2056b32e4a8abff250caf0b5"} Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.847210 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerStarted","Data":"ac04f34d07519fe9dd3a66842e1d7ce4acf0bcb7a4067dd4c43b6fa708598d8c"} Feb 17 00:29:17 crc kubenswrapper[4791]: I0217 00:29:17.887135 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.372973 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk"] Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.503853 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m"] Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.505039 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.507537 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.521146 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m"] Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.616841 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/21f15fa5-1f89-4aae-b6df-6a7c33630f43-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.616903 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/21f15fa5-1f89-4aae-b6df-6a7c33630f43-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.616920 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/21f15fa5-1f89-4aae-b6df-6a7c33630f43-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.617028 4791 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nppcx\" (UniqueName: \"kubernetes.io/projected/21f15fa5-1f89-4aae-b6df-6a7c33630f43-kube-api-access-nppcx\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.721709 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nppcx\" (UniqueName: \"kubernetes.io/projected/21f15fa5-1f89-4aae-b6df-6a7c33630f43-kube-api-access-nppcx\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.721842 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/21f15fa5-1f89-4aae-b6df-6a7c33630f43-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.721888 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/21f15fa5-1f89-4aae-b6df-6a7c33630f43-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.721940 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: 
\"kubernetes.io/secret/21f15fa5-1f89-4aae-b6df-6a7c33630f43-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.723783 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/21f15fa5-1f89-4aae-b6df-6a7c33630f43-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.724298 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/21f15fa5-1f89-4aae-b6df-6a7c33630f43-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.728809 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/21f15fa5-1f89-4aae-b6df-6a7c33630f43-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.741687 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nppcx\" (UniqueName: \"kubernetes.io/projected/21f15fa5-1f89-4aae-b6df-6a7c33630f43-kube-api-access-nppcx\") pod \"default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m\" (UID: \"21f15fa5-1f89-4aae-b6df-6a7c33630f43\") " 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.828721 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.856445 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" event={"ID":"4634db00-8e3c-4569-b66c-ef549eda9204","Type":"ContainerStarted","Data":"bd8e5e049c1bd26e9cbfcd750d20fa2c47955c830e7d64eaa2e70221b3dbd377"} Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.856482 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" event={"ID":"4634db00-8e3c-4569-b66c-ef549eda9204","Type":"ContainerStarted","Data":"8179bd87591c1f9881dc2de0c592243d6bc718cdd78eb7e8596a2b1a06322d81"} Feb 17 00:29:18 crc kubenswrapper[4791]: I0217 00:29:18.861253 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerStarted","Data":"18af9ef88634abf023c824bbe3b1e72d8f54ba620e9c0749e7996850850d7a13"} Feb 17 00:29:19 crc kubenswrapper[4791]: I0217 00:29:19.457639 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m"] Feb 17 00:29:19 crc kubenswrapper[4791]: W0217 00:29:19.472062 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21f15fa5_1f89_4aae_b6df_6a7c33630f43.slice/crio-b94f6544d5320ca846626d17ec597b56d2c2355b298e4f52e8a2f597da4803fd WatchSource:0}: Error finding container b94f6544d5320ca846626d17ec597b56d2c2355b298e4f52e8a2f597da4803fd: Status 404 returned error can't 
find the container with id b94f6544d5320ca846626d17ec597b56d2c2355b298e4f52e8a2f597da4803fd Feb 17 00:29:19 crc kubenswrapper[4791]: I0217 00:29:19.874621 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" event={"ID":"21f15fa5-1f89-4aae-b6df-6a7c33630f43","Type":"ContainerStarted","Data":"bd9ce98b9eef1bbc5c184db64eaad0b62c27677fe65e67cfadff65adff90677c"} Feb 17 00:29:19 crc kubenswrapper[4791]: I0217 00:29:19.874664 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" event={"ID":"21f15fa5-1f89-4aae-b6df-6a7c33630f43","Type":"ContainerStarted","Data":"b94f6544d5320ca846626d17ec597b56d2c2355b298e4f52e8a2f597da4803fd"} Feb 17 00:29:23 crc kubenswrapper[4791]: I0217 00:29:23.439566 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Feb 17 00:29:23 crc kubenswrapper[4791]: I0217 00:29:23.479974 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Feb 17 00:29:23 crc kubenswrapper[4791]: I0217 00:29:23.965979 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Feb 17 00:29:24 crc kubenswrapper[4791]: I0217 00:29:24.972857 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:29:24 crc kubenswrapper[4791]: I0217 00:29:24.973178 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:29:26 crc kubenswrapper[4791]: I0217 00:29:26.935853 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerStarted","Data":"388387806d1ad8d2a29e6701dc33e7f13547353ccae0456fc7d97053e594880f"} Feb 17 00:29:26 crc kubenswrapper[4791]: I0217 00:29:26.938087 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerStarted","Data":"c9c7d14efbce8d1901ca1ad40d55496b577d720f8fb2bb0d33040a97de15d3e1"} Feb 17 00:29:26 crc kubenswrapper[4791]: I0217 00:29:26.959375 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" podStartSLOduration=8.12337634 podStartE2EDuration="23.959313358s" podCreationTimestamp="2026-02-17 00:29:03 +0000 UTC" firstStartedPulling="2026-02-17 00:29:10.807731143 +0000 UTC m=+1408.287243670" lastFinishedPulling="2026-02-17 00:29:26.643668121 +0000 UTC m=+1424.123180688" observedRunningTime="2026-02-17 00:29:26.953387823 +0000 UTC m=+1424.432900350" watchObservedRunningTime="2026-02-17 00:29:26.959313358 +0000 UTC m=+1424.438825885" Feb 17 00:29:27 crc kubenswrapper[4791]: I0217 00:29:27.947856 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" event={"ID":"4634db00-8e3c-4569-b66c-ef549eda9204","Type":"ContainerStarted","Data":"05dc36f926de66665bc35dd51670031593be97eb4ceb070b46accbe83c7ef398"} Feb 17 00:29:27 crc kubenswrapper[4791]: I0217 00:29:27.949202 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" 
event={"ID":"21f15fa5-1f89-4aae-b6df-6a7c33630f43","Type":"ContainerStarted","Data":"10d44b843e268319ad274fcb5d47ba27034d14c8f5811714251ce04d0a229fe1"} Feb 17 00:29:27 crc kubenswrapper[4791]: I0217 00:29:27.954931 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerStarted","Data":"908cd5621ddb25167215f9bd3d62ca48f923f099199c4bf64035896d11a3df19"} Feb 17 00:29:27 crc kubenswrapper[4791]: I0217 00:29:27.982490 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" podStartSLOduration=5.138138323 podStartE2EDuration="18.982449715s" podCreationTimestamp="2026-02-17 00:29:09 +0000 UTC" firstStartedPulling="2026-02-17 00:29:12.842529151 +0000 UTC m=+1410.322041668" lastFinishedPulling="2026-02-17 00:29:26.686840493 +0000 UTC m=+1424.166353060" observedRunningTime="2026-02-17 00:29:26.97388673 +0000 UTC m=+1424.453399257" watchObservedRunningTime="2026-02-17 00:29:27.982449715 +0000 UTC m=+1425.461962342" Feb 17 00:29:28 crc kubenswrapper[4791]: I0217 00:29:28.017156 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" podStartSLOduration=6.476607603 podStartE2EDuration="22.017092762s" podCreationTimestamp="2026-02-17 00:29:06 +0000 UTC" firstStartedPulling="2026-02-17 00:29:11.296341393 +0000 UTC m=+1408.775853920" lastFinishedPulling="2026-02-17 00:29:26.836826552 +0000 UTC m=+1424.316339079" observedRunningTime="2026-02-17 00:29:28.002808708 +0000 UTC m=+1425.482321275" watchObservedRunningTime="2026-02-17 00:29:28.017092762 +0000 UTC m=+1425.496605329" Feb 17 00:29:28 crc kubenswrapper[4791]: I0217 00:29:28.017374 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" podStartSLOduration=2.526549511 podStartE2EDuration="11.017363539s" podCreationTimestamp="2026-02-17 00:29:17 +0000 UTC" firstStartedPulling="2026-02-17 00:29:18.406273487 +0000 UTC m=+1415.885786014" lastFinishedPulling="2026-02-17 00:29:26.897087515 +0000 UTC m=+1424.376600042" observedRunningTime="2026-02-17 00:29:27.982795806 +0000 UTC m=+1425.462308373" watchObservedRunningTime="2026-02-17 00:29:28.017363539 +0000 UTC m=+1425.496876116" Feb 17 00:29:28 crc kubenswrapper[4791]: I0217 00:29:28.047259 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" podStartSLOduration=2.545586632 podStartE2EDuration="10.047231398s" podCreationTimestamp="2026-02-17 00:29:18 +0000 UTC" firstStartedPulling="2026-02-17 00:29:19.476164157 +0000 UTC m=+1416.955676684" lastFinishedPulling="2026-02-17 00:29:26.977808923 +0000 UTC m=+1424.457321450" observedRunningTime="2026-02-17 00:29:28.028730173 +0000 UTC m=+1425.508242720" watchObservedRunningTime="2026-02-17 00:29:28.047231398 +0000 UTC m=+1425.526744085" Feb 17 00:29:30 crc kubenswrapper[4791]: I0217 00:29:30.733148 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-7q5s9"] Feb 17 00:29:30 crc kubenswrapper[4791]: I0217 00:29:30.733843 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" podUID="99be99ec-fe95-419c-ba3f-b4e3601e433a" containerName="default-interconnect" containerID="cri-o://5c0b32a599acff163bd3b75453552de8cfb3a2023cfaf54423af8eea541e17ba" gracePeriod=30 Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.984021 4791 generic.go:334] "Generic (PLEG): container finished" podID="0d7ac416-17e0-4f86-8786-0afdec7fc240" containerID="74d069a0a7b95b2c13b5d7774c37eff82d25f60a2056b32e4a8abff250caf0b5" 
exitCode=0 Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.984132 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerDied","Data":"74d069a0a7b95b2c13b5d7774c37eff82d25f60a2056b32e4a8abff250caf0b5"} Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.984976 4791 scope.go:117] "RemoveContainer" containerID="74d069a0a7b95b2c13b5d7774c37eff82d25f60a2056b32e4a8abff250caf0b5" Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.990707 4791 generic.go:334] "Generic (PLEG): container finished" podID="21f15fa5-1f89-4aae-b6df-6a7c33630f43" containerID="bd9ce98b9eef1bbc5c184db64eaad0b62c27677fe65e67cfadff65adff90677c" exitCode=0 Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.990817 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" event={"ID":"21f15fa5-1f89-4aae-b6df-6a7c33630f43","Type":"ContainerDied","Data":"bd9ce98b9eef1bbc5c184db64eaad0b62c27677fe65e67cfadff65adff90677c"} Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.991440 4791 scope.go:117] "RemoveContainer" containerID="bd9ce98b9eef1bbc5c184db64eaad0b62c27677fe65e67cfadff65adff90677c" Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.996947 4791 generic.go:334] "Generic (PLEG): container finished" podID="6e985bca-5f71-47e0-bd63-ede2ad79bd7e" containerID="18af9ef88634abf023c824bbe3b1e72d8f54ba620e9c0749e7996850850d7a13" exitCode=0 Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.997021 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerDied","Data":"18af9ef88634abf023c824bbe3b1e72d8f54ba620e9c0749e7996850850d7a13"} Feb 17 00:29:31 crc kubenswrapper[4791]: I0217 00:29:31.997573 4791 
scope.go:117] "RemoveContainer" containerID="18af9ef88634abf023c824bbe3b1e72d8f54ba620e9c0749e7996850850d7a13" Feb 17 00:29:32 crc kubenswrapper[4791]: I0217 00:29:32.005056 4791 generic.go:334] "Generic (PLEG): container finished" podID="99be99ec-fe95-419c-ba3f-b4e3601e433a" containerID="5c0b32a599acff163bd3b75453552de8cfb3a2023cfaf54423af8eea541e17ba" exitCode=0 Feb 17 00:29:32 crc kubenswrapper[4791]: I0217 00:29:32.005165 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" event={"ID":"99be99ec-fe95-419c-ba3f-b4e3601e433a","Type":"ContainerDied","Data":"5c0b32a599acff163bd3b75453552de8cfb3a2023cfaf54423af8eea541e17ba"} Feb 17 00:29:32 crc kubenswrapper[4791]: I0217 00:29:32.006859 4791 generic.go:334] "Generic (PLEG): container finished" podID="e3d9a725-2f9c-4fcc-8610-4b297a3d689d" containerID="ac04f34d07519fe9dd3a66842e1d7ce4acf0bcb7a4067dd4c43b6fa708598d8c" exitCode=0 Feb 17 00:29:32 crc kubenswrapper[4791]: I0217 00:29:32.006897 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerDied","Data":"ac04f34d07519fe9dd3a66842e1d7ce4acf0bcb7a4067dd4c43b6fa708598d8c"} Feb 17 00:29:32 crc kubenswrapper[4791]: I0217 00:29:32.007410 4791 scope.go:117] "RemoveContainer" containerID="ac04f34d07519fe9dd3a66842e1d7ce4acf0bcb7a4067dd4c43b6fa708598d8c" Feb 17 00:29:33 crc kubenswrapper[4791]: I0217 00:29:33.013608 4791 generic.go:334] "Generic (PLEG): container finished" podID="4634db00-8e3c-4569-b66c-ef549eda9204" containerID="bd8e5e049c1bd26e9cbfcd750d20fa2c47955c830e7d64eaa2e70221b3dbd377" exitCode=0 Feb 17 00:29:33 crc kubenswrapper[4791]: I0217 00:29:33.013960 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" 
event={"ID":"4634db00-8e3c-4569-b66c-ef549eda9204","Type":"ContainerDied","Data":"bd8e5e049c1bd26e9cbfcd750d20fa2c47955c830e7d64eaa2e70221b3dbd377"} Feb 17 00:29:33 crc kubenswrapper[4791]: I0217 00:29:33.014477 4791 scope.go:117] "RemoveContainer" containerID="bd8e5e049c1bd26e9cbfcd750d20fa2c47955c830e7d64eaa2e70221b3dbd377" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.716954 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.769974 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-sjw5t"] Feb 17 00:29:34 crc kubenswrapper[4791]: E0217 00:29:34.770366 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99be99ec-fe95-419c-ba3f-b4e3601e433a" containerName="default-interconnect" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.770385 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="99be99ec-fe95-419c-ba3f-b4e3601e433a" containerName="default-interconnect" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.770517 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="99be99ec-fe95-419c-ba3f-b4e3601e433a" containerName="default-interconnect" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.771161 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.779690 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-sjw5t"] Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.883974 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-users\") pod \"99be99ec-fe95-419c-ba3f-b4e3601e433a\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884043 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-credentials\") pod \"99be99ec-fe95-419c-ba3f-b4e3601e433a\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884119 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-config\") pod \"99be99ec-fe95-419c-ba3f-b4e3601e433a\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884181 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-ca\") pod \"99be99ec-fe95-419c-ba3f-b4e3601e433a\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884222 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: 
\"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-credentials\") pod \"99be99ec-fe95-419c-ba3f-b4e3601e433a\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884253 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-ca\") pod \"99be99ec-fe95-419c-ba3f-b4e3601e433a\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884315 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt7d8\" (UniqueName: \"kubernetes.io/projected/99be99ec-fe95-419c-ba3f-b4e3601e433a-kube-api-access-qt7d8\") pod \"99be99ec-fe95-419c-ba3f-b4e3601e433a\" (UID: \"99be99ec-fe95-419c-ba3f-b4e3601e433a\") " Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884483 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884516 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884552 4791 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/9caafe24-6ee4-425b-b175-c0901dab223f-sasl-config\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884604 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjhgv\" (UniqueName: \"kubernetes.io/projected/9caafe24-6ee4-425b-b175-c0901dab223f-kube-api-access-vjhgv\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884665 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-sasl-users\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884716 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884747 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: 
\"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.884840 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "99be99ec-fe95-419c-ba3f-b4e3601e433a" (UID: "99be99ec-fe95-419c-ba3f-b4e3601e433a"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.889295 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "99be99ec-fe95-419c-ba3f-b4e3601e433a" (UID: "99be99ec-fe95-419c-ba3f-b4e3601e433a"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.889777 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "99be99ec-fe95-419c-ba3f-b4e3601e433a" (UID: "99be99ec-fe95-419c-ba3f-b4e3601e433a"). InnerVolumeSpecName "default-interconnect-openstack-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.890051 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "99be99ec-fe95-419c-ba3f-b4e3601e433a" (UID: "99be99ec-fe95-419c-ba3f-b4e3601e433a"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.890694 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "99be99ec-fe95-419c-ba3f-b4e3601e433a" (UID: "99be99ec-fe95-419c-ba3f-b4e3601e433a"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.891333 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "99be99ec-fe95-419c-ba3f-b4e3601e433a" (UID: "99be99ec-fe95-419c-ba3f-b4e3601e433a"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.894223 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99be99ec-fe95-419c-ba3f-b4e3601e433a-kube-api-access-qt7d8" (OuterVolumeSpecName: "kube-api-access-qt7d8") pod "99be99ec-fe95-419c-ba3f-b4e3601e433a" (UID: "99be99ec-fe95-419c-ba3f-b4e3601e433a"). InnerVolumeSpecName "kube-api-access-qt7d8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.986230 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-sasl-users\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.986295 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.986877 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.986940 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.986964 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987026 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/9caafe24-6ee4-425b-b175-c0901dab223f-sasl-config\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987081 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjhgv\" (UniqueName: \"kubernetes.io/projected/9caafe24-6ee4-425b-b175-c0901dab223f-kube-api-access-vjhgv\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987163 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt7d8\" (UniqueName: \"kubernetes.io/projected/99be99ec-fe95-419c-ba3f-b4e3601e433a-kube-api-access-qt7d8\") on node \"crc\" DevicePath \"\"" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987184 4791 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-users\") on node \"crc\" DevicePath \"\"" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987197 4791 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-credentials\") on node \"crc\" 
DevicePath \"\"" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987209 4791 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99be99ec-fe95-419c-ba3f-b4e3601e433a-sasl-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987221 4791 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987235 4791 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.987249 4791 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99be99ec-fe95-419c-ba3f-b4e3601e433a-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.988263 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/9caafe24-6ee4-425b-b175-c0901dab223f-sasl-config\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.993024 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: 
\"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.996515 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-sasl-users\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.997683 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.997892 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:34 crc kubenswrapper[4791]: I0217 00:29:34.997917 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/9caafe24-6ee4-425b-b175-c0901dab223f-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.005259 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vjhgv\" (UniqueName: \"kubernetes.io/projected/9caafe24-6ee4-425b-b175-c0901dab223f-kube-api-access-vjhgv\") pod \"default-interconnect-68864d46cb-sjw5t\" (UID: \"9caafe24-6ee4-425b-b175-c0901dab223f\") " pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.027325 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" event={"ID":"99be99ec-fe95-419c-ba3f-b4e3601e433a","Type":"ContainerDied","Data":"b23f71e1251fc3bfc8c2584134054c100e44edda849e4b8dcd9907f81f1b91b4"} Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.027380 4791 scope.go:117] "RemoveContainer" containerID="5c0b32a599acff163bd3b75453552de8cfb3a2023cfaf54423af8eea541e17ba" Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.027391 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-7q5s9" Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.063939 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-7q5s9"] Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.071284 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-7q5s9"] Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.095405 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.240388 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99be99ec-fe95-419c-ba3f-b4e3601e433a" path="/var/lib/kubelet/pods/99be99ec-fe95-419c-ba3f-b4e3601e433a/volumes" Feb 17 00:29:35 crc kubenswrapper[4791]: I0217 00:29:35.318467 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-sjw5t"] Feb 17 00:29:35 crc kubenswrapper[4791]: W0217 00:29:35.327461 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9caafe24_6ee4_425b_b175_c0901dab223f.slice/crio-3c674d86e19899584af48acb7188101abc04293d15b5b2d615c7650df2ebbcfc WatchSource:0}: Error finding container 3c674d86e19899584af48acb7188101abc04293d15b5b2d615c7650df2ebbcfc: Status 404 returned error can't find the container with id 3c674d86e19899584af48acb7188101abc04293d15b5b2d615c7650df2ebbcfc Feb 17 00:29:36 crc kubenswrapper[4791]: I0217 00:29:36.034323 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" event={"ID":"9caafe24-6ee4-425b-b175-c0901dab223f","Type":"ContainerStarted","Data":"a9257b2f6143e5fdbef6d9970265ea2019fa470cadca2aa64b11d9eed6b20c4f"} Feb 17 00:29:36 crc kubenswrapper[4791]: I0217 00:29:36.034365 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" event={"ID":"9caafe24-6ee4-425b-b175-c0901dab223f","Type":"ContainerStarted","Data":"3c674d86e19899584af48acb7188101abc04293d15b5b2d615c7650df2ebbcfc"} Feb 17 00:29:36 crc kubenswrapper[4791]: I0217 00:29:36.061129 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-sjw5t" podStartSLOduration=6.061090665 podStartE2EDuration="6.061090665s" 
podCreationTimestamp="2026-02-17 00:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 00:29:36.056243804 +0000 UTC m=+1433.535756321" watchObservedRunningTime="2026-02-17 00:29:36.061090665 +0000 UTC m=+1433.540603212" Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.042965 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerStarted","Data":"750934549b030909bb4d598855ca653fb2ecc38c95f4654335e4f92f04b972c6"} Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.046191 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" event={"ID":"4634db00-8e3c-4569-b66c-ef549eda9204","Type":"ContainerStarted","Data":"ce59d847af01180940df3f09f6d2496601ef0cdc8918bf8c9cacb859dc5df681"} Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.048835 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" event={"ID":"21f15fa5-1f89-4aae-b6df-6a7c33630f43","Type":"ContainerStarted","Data":"698ff17b1cc097462d0e76ae27f971dd27b7d83fda12b9acb2a842abdcc4a4c6"} Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.051869 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerStarted","Data":"440d8a71773e87a97c65f1be17bc069de0c55e9696ecb2c7657667e4e4affa5c"} Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.056023 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" 
event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerStarted","Data":"df1187e2bef149dfd491161a3578bd8f2992980e1c6e4dd9f867e6531a6d761f"} Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.883296 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.884388 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.886530 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.886983 4791 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Feb 17 00:29:37 crc kubenswrapper[4791]: I0217 00:29:37.894363 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.031610 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/43948ca6-1e04-4a7f-867d-d5f6d69d240d-qdr-test-config\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.031655 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v72ht\" (UniqueName: \"kubernetes.io/projected/43948ca6-1e04-4a7f-867d-d5f6d69d240d-kube-api-access-v72ht\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.031685 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: 
\"kubernetes.io/secret/43948ca6-1e04-4a7f-867d-d5f6d69d240d-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.064694 4791 generic.go:334] "Generic (PLEG): container finished" podID="4634db00-8e3c-4569-b66c-ef549eda9204" containerID="ce59d847af01180940df3f09f6d2496601ef0cdc8918bf8c9cacb859dc5df681" exitCode=0 Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.064776 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" event={"ID":"4634db00-8e3c-4569-b66c-ef549eda9204","Type":"ContainerDied","Data":"ce59d847af01180940df3f09f6d2496601ef0cdc8918bf8c9cacb859dc5df681"} Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.064836 4791 scope.go:117] "RemoveContainer" containerID="bd8e5e049c1bd26e9cbfcd750d20fa2c47955c830e7d64eaa2e70221b3dbd377" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.065444 4791 scope.go:117] "RemoveContainer" containerID="ce59d847af01180940df3f09f6d2496601ef0cdc8918bf8c9cacb859dc5df681" Feb 17 00:29:38 crc kubenswrapper[4791]: E0217 00:29:38.065742 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk_service-telemetry(4634db00-8e3c-4569-b66c-ef549eda9204)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" podUID="4634db00-8e3c-4569-b66c-ef549eda9204" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.067274 4791 generic.go:334] "Generic (PLEG): container finished" podID="6e985bca-5f71-47e0-bd63-ede2ad79bd7e" containerID="440d8a71773e87a97c65f1be17bc069de0c55e9696ecb2c7657667e4e4affa5c" exitCode=0 Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.067349 4791 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerDied","Data":"440d8a71773e87a97c65f1be17bc069de0c55e9696ecb2c7657667e4e4affa5c"} Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.067726 4791 scope.go:117] "RemoveContainer" containerID="440d8a71773e87a97c65f1be17bc069de0c55e9696ecb2c7657667e4e4affa5c" Feb 17 00:29:38 crc kubenswrapper[4791]: E0217 00:29:38.067929 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p_service-telemetry(6e985bca-5f71-47e0-bd63-ede2ad79bd7e)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" podUID="6e985bca-5f71-47e0-bd63-ede2ad79bd7e" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.069723 4791 generic.go:334] "Generic (PLEG): container finished" podID="e3d9a725-2f9c-4fcc-8610-4b297a3d689d" containerID="df1187e2bef149dfd491161a3578bd8f2992980e1c6e4dd9f867e6531a6d761f" exitCode=0 Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.069781 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerDied","Data":"df1187e2bef149dfd491161a3578bd8f2992980e1c6e4dd9f867e6531a6d761f"} Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.070331 4791 scope.go:117] "RemoveContainer" containerID="df1187e2bef149dfd491161a3578bd8f2992980e1c6e4dd9f867e6531a6d761f" Feb 17 00:29:38 crc kubenswrapper[4791]: E0217 00:29:38.070621 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge 
pod=default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9_service-telemetry(e3d9a725-2f9c-4fcc-8610-4b297a3d689d)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" podUID="e3d9a725-2f9c-4fcc-8610-4b297a3d689d" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.072731 4791 generic.go:334] "Generic (PLEG): container finished" podID="0d7ac416-17e0-4f86-8786-0afdec7fc240" containerID="750934549b030909bb4d598855ca653fb2ecc38c95f4654335e4f92f04b972c6" exitCode=0 Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.072777 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerDied","Data":"750934549b030909bb4d598855ca653fb2ecc38c95f4654335e4f92f04b972c6"} Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.073098 4791 scope.go:117] "RemoveContainer" containerID="750934549b030909bb4d598855ca653fb2ecc38c95f4654335e4f92f04b972c6" Feb 17 00:29:38 crc kubenswrapper[4791]: E0217 00:29:38.073284 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8_service-telemetry(0d7ac416-17e0-4f86-8786-0afdec7fc240)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" podUID="0d7ac416-17e0-4f86-8786-0afdec7fc240" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.083031 4791 generic.go:334] "Generic (PLEG): container finished" podID="21f15fa5-1f89-4aae-b6df-6a7c33630f43" containerID="698ff17b1cc097462d0e76ae27f971dd27b7d83fda12b9acb2a842abdcc4a4c6" exitCode=0 Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.083076 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" 
event={"ID":"21f15fa5-1f89-4aae-b6df-6a7c33630f43","Type":"ContainerDied","Data":"698ff17b1cc097462d0e76ae27f971dd27b7d83fda12b9acb2a842abdcc4a4c6"} Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.083718 4791 scope.go:117] "RemoveContainer" containerID="698ff17b1cc097462d0e76ae27f971dd27b7d83fda12b9acb2a842abdcc4a4c6" Feb 17 00:29:38 crc kubenswrapper[4791]: E0217 00:29:38.083926 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m_service-telemetry(21f15fa5-1f89-4aae-b6df-6a7c33630f43)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" podUID="21f15fa5-1f89-4aae-b6df-6a7c33630f43" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.107315 4791 scope.go:117] "RemoveContainer" containerID="18af9ef88634abf023c824bbe3b1e72d8f54ba620e9c0749e7996850850d7a13" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.132876 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/43948ca6-1e04-4a7f-867d-d5f6d69d240d-qdr-test-config\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.132936 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v72ht\" (UniqueName: \"kubernetes.io/projected/43948ca6-1e04-4a7f-867d-d5f6d69d240d-kube-api-access-v72ht\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.132983 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: 
\"kubernetes.io/secret/43948ca6-1e04-4a7f-867d-d5f6d69d240d-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.133662 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/43948ca6-1e04-4a7f-867d-d5f6d69d240d-qdr-test-config\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.143346 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/43948ca6-1e04-4a7f-867d-d5f6d69d240d-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.143513 4791 scope.go:117] "RemoveContainer" containerID="ac04f34d07519fe9dd3a66842e1d7ce4acf0bcb7a4067dd4c43b6fa708598d8c" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.165817 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v72ht\" (UniqueName: \"kubernetes.io/projected/43948ca6-1e04-4a7f-867d-d5f6d69d240d-kube-api-access-v72ht\") pod \"qdr-test\" (UID: \"43948ca6-1e04-4a7f-867d-d5f6d69d240d\") " pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.207927 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/qdr-test" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.240392 4791 scope.go:117] "RemoveContainer" containerID="74d069a0a7b95b2c13b5d7774c37eff82d25f60a2056b32e4a8abff250caf0b5" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.274423 4791 scope.go:117] "RemoveContainer" containerID="bd9ce98b9eef1bbc5c184db64eaad0b62c27677fe65e67cfadff65adff90677c" Feb 17 00:29:38 crc kubenswrapper[4791]: I0217 00:29:38.642541 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Feb 17 00:29:38 crc kubenswrapper[4791]: W0217 00:29:38.646234 4791 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43948ca6_1e04_4a7f_867d_d5f6d69d240d.slice/crio-20f076de0e2e2ce145756cbedde9e7832677ea840dc57057ccc25152e0394c39 WatchSource:0}: Error finding container 20f076de0e2e2ce145756cbedde9e7832677ea840dc57057ccc25152e0394c39: Status 404 returned error can't find the container with id 20f076de0e2e2ce145756cbedde9e7832677ea840dc57057ccc25152e0394c39 Feb 17 00:29:39 crc kubenswrapper[4791]: I0217 00:29:39.096173 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"43948ca6-1e04-4a7f-867d-d5f6d69d240d","Type":"ContainerStarted","Data":"20f076de0e2e2ce145756cbedde9e7832677ea840dc57057ccc25152e0394c39"} Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.162177 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"43948ca6-1e04-4a7f-867d-d5f6d69d240d","Type":"ContainerStarted","Data":"a33231ba3a43a59d314c17bc1db0dffe12835a7ebb8205582c8fe6f3e5425e5d"} Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.178870 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=2.51780443 podStartE2EDuration="10.178855847s" podCreationTimestamp="2026-02-17 00:29:37 +0000 
UTC" firstStartedPulling="2026-02-17 00:29:38.648691498 +0000 UTC m=+1436.128204025" lastFinishedPulling="2026-02-17 00:29:46.309742915 +0000 UTC m=+1443.789255442" observedRunningTime="2026-02-17 00:29:47.175977238 +0000 UTC m=+1444.655489765" watchObservedRunningTime="2026-02-17 00:29:47.178855847 +0000 UTC m=+1444.658368374" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.525735 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-k8frg"] Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.527491 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.530195 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.532251 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.532479 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.534622 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.536530 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-k8frg"] Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.536712 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.537475 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" 
Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.593846 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsx9p\" (UniqueName: \"kubernetes.io/projected/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-kube-api-access-lsx9p\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.593888 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-publisher\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.593914 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-healthcheck-log\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.593938 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-config\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.593957 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-entrypoint-script\") pod 
\"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.594141 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-sensubility-config\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.594163 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.695059 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-config\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.695177 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.695331 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: 
\"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-sensubility-config\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.695369 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.695419 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsx9p\" (UniqueName: \"kubernetes.io/projected/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-kube-api-access-lsx9p\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.695456 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-publisher\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.695495 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-healthcheck-log\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.696006 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-config\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.696738 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-sensubility-config\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.696796 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.697095 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.697093 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-healthcheck-log\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.697508 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-publisher\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.737494 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsx9p\" (UniqueName: \"kubernetes.io/projected/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-kube-api-access-lsx9p\") pod \"stf-smoketest-smoke1-k8frg\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:47 crc kubenswrapper[4791]: I0217 00:29:47.847435 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.011650 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.023432 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.023548 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.099909 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-k8frg"] Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.100533 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqsb9\" (UniqueName: \"kubernetes.io/projected/5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094-kube-api-access-zqsb9\") pod \"curl\" (UID: \"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094\") " pod="service-telemetry/curl" Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.172043 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-k8frg" event={"ID":"e8590bb8-aabf-4bf2-b28d-f1c7b0872780","Type":"ContainerStarted","Data":"fb78639120c2ceb67338064e7f2d7aba445e0c94d0f5aa427da45e085399f0fb"} Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.201610 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqsb9\" (UniqueName: \"kubernetes.io/projected/5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094-kube-api-access-zqsb9\") pod \"curl\" (UID: \"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094\") " pod="service-telemetry/curl" Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.225790 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqsb9\" (UniqueName: \"kubernetes.io/projected/5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094-kube-api-access-zqsb9\") pod \"curl\" (UID: \"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094\") " pod="service-telemetry/curl" Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.353639 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Feb 17 00:29:48 crc kubenswrapper[4791]: I0217 00:29:48.782353 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Feb 17 00:29:49 crc kubenswrapper[4791]: I0217 00:29:49.178935 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094","Type":"ContainerStarted","Data":"690037d92f4fa5c73c59322e6169368ac5590df450d4325ba298a0d94081d46b"} Feb 17 00:29:49 crc kubenswrapper[4791]: I0217 00:29:49.220468 4791 scope.go:117] "RemoveContainer" containerID="440d8a71773e87a97c65f1be17bc069de0c55e9696ecb2c7657667e4e4affa5c" Feb 17 00:29:49 crc kubenswrapper[4791]: I0217 00:29:49.220523 4791 scope.go:117] "RemoveContainer" containerID="750934549b030909bb4d598855ca653fb2ecc38c95f4654335e4f92f04b972c6" Feb 17 00:29:49 crc kubenswrapper[4791]: I0217 00:29:49.220588 4791 scope.go:117] "RemoveContainer" containerID="ce59d847af01180940df3f09f6d2496601ef0cdc8918bf8c9cacb859dc5df681" Feb 17 00:29:50 crc kubenswrapper[4791]: I0217 00:29:50.200783 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk" event={"ID":"4634db00-8e3c-4569-b66c-ef549eda9204","Type":"ContainerStarted","Data":"02ff514e88ff70cec9d63b954878a0505cc7c820d435f196394093abc4284b2a"} Feb 17 00:29:50 crc kubenswrapper[4791]: I0217 00:29:50.204889 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8" event={"ID":"0d7ac416-17e0-4f86-8786-0afdec7fc240","Type":"ContainerStarted","Data":"97ce3fc7429c7ac09971828969b2933496e787c2cc86f2e4d6b34ea7a9bf3cd4"} Feb 17 00:29:50 crc kubenswrapper[4791]: I0217 00:29:50.207890 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p" 
event={"ID":"6e985bca-5f71-47e0-bd63-ede2ad79bd7e","Type":"ContainerStarted","Data":"dcaf66b043a2c6b72c0b3c5c46769a60605df0b5eb5a14becae05bced65affbb"} Feb 17 00:29:51 crc kubenswrapper[4791]: I0217 00:29:51.214461 4791 generic.go:334] "Generic (PLEG): container finished" podID="5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094" containerID="138acf522734f6f66553cdc60197343cc0def84117ff762dbfd2617d285bfd2d" exitCode=0 Feb 17 00:29:51 crc kubenswrapper[4791]: I0217 00:29:51.214499 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094","Type":"ContainerDied","Data":"138acf522734f6f66553cdc60197343cc0def84117ff762dbfd2617d285bfd2d"} Feb 17 00:29:51 crc kubenswrapper[4791]: I0217 00:29:51.220643 4791 scope.go:117] "RemoveContainer" containerID="698ff17b1cc097462d0e76ae27f971dd27b7d83fda12b9acb2a842abdcc4a4c6" Feb 17 00:29:53 crc kubenswrapper[4791]: I0217 00:29:53.227891 4791 scope.go:117] "RemoveContainer" containerID="df1187e2bef149dfd491161a3578bd8f2992980e1c6e4dd9f867e6531a6d761f" Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.417123 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.507229 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqsb9\" (UniqueName: \"kubernetes.io/projected/5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094-kube-api-access-zqsb9\") pod \"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094\" (UID: \"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094\") " Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.516539 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094-kube-api-access-zqsb9" (OuterVolumeSpecName: "kube-api-access-zqsb9") pod "5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094" (UID: "5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094"). 
InnerVolumeSpecName "kube-api-access-zqsb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.550338 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094/curl/0.log" Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.609606 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqsb9\" (UniqueName: \"kubernetes.io/projected/5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094-kube-api-access-zqsb9\") on node \"crc\" DevicePath \"\"" Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.804215 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-z8m2z_135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd/prometheus-webhook-snmp/0.log" Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.972656 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:29:54 crc kubenswrapper[4791]: I0217 00:29:54.972717 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:29:55 crc kubenswrapper[4791]: I0217 00:29:55.245229 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094","Type":"ContainerDied","Data":"690037d92f4fa5c73c59322e6169368ac5590df450d4325ba298a0d94081d46b"} Feb 17 00:29:55 crc kubenswrapper[4791]: I0217 00:29:55.245274 4791 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="690037d92f4fa5c73c59322e6169368ac5590df450d4325ba298a0d94081d46b" Feb 17 00:29:55 crc kubenswrapper[4791]: I0217 00:29:55.245337 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Feb 17 00:29:59 crc kubenswrapper[4791]: I0217 00:29:59.271335 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-k8frg" event={"ID":"e8590bb8-aabf-4bf2-b28d-f1c7b0872780","Type":"ContainerStarted","Data":"92e80ec48707a383e8e983cd6a7e22c9b6a5c7d87140df423504e893b8aba7ed"} Feb 17 00:29:59 crc kubenswrapper[4791]: I0217 00:29:59.274641 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9" event={"ID":"e3d9a725-2f9c-4fcc-8610-4b297a3d689d","Type":"ContainerStarted","Data":"1cefd2a346da979f0d4271d76cfe3cfaaaa0b7b2c62557a9196303556d822ac9"} Feb 17 00:29:59 crc kubenswrapper[4791]: I0217 00:29:59.276604 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m" event={"ID":"21f15fa5-1f89-4aae-b6df-6a7c33630f43","Type":"ContainerStarted","Data":"0913487f3336339a89c2c386a8b3f9c8ee342543562c2a1621a25e49b0185945"} Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.142679 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9"] Feb 17 00:30:00 crc kubenswrapper[4791]: E0217 00:30:00.143326 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094" containerName="curl" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.143348 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094" containerName="curl" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.143457 4791 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5fbf81dd-6a9e-4c2b-a46a-1461a3a5b094" containerName="curl" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.143900 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.145949 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.146962 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.151680 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9"] Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.196861 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8399bca3-cb20-4884-9b21-8ec3dae4c326-secret-volume\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.196954 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjpgh\" (UniqueName: \"kubernetes.io/projected/8399bca3-cb20-4884-9b21-8ec3dae4c326-kube-api-access-fjpgh\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.196992 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/8399bca3-cb20-4884-9b21-8ec3dae4c326-config-volume\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.298702 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8399bca3-cb20-4884-9b21-8ec3dae4c326-config-volume\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.298828 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8399bca3-cb20-4884-9b21-8ec3dae4c326-secret-volume\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.298888 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjpgh\" (UniqueName: \"kubernetes.io/projected/8399bca3-cb20-4884-9b21-8ec3dae4c326-kube-api-access-fjpgh\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.299764 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8399bca3-cb20-4884-9b21-8ec3dae4c326-config-volume\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.312000 4791 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8399bca3-cb20-4884-9b21-8ec3dae4c326-secret-volume\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.329051 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjpgh\" (UniqueName: \"kubernetes.io/projected/8399bca3-cb20-4884-9b21-8ec3dae4c326-kube-api-access-fjpgh\") pod \"collect-profiles-29521470-l9cv9\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.460599 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:00 crc kubenswrapper[4791]: I0217 00:30:00.925842 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9"] Feb 17 00:30:08 crc kubenswrapper[4791]: I0217 00:30:08.363673 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" event={"ID":"8399bca3-cb20-4884-9b21-8ec3dae4c326","Type":"ContainerStarted","Data":"8b3cc8284e64a2f5812158dddd5a1b23b6c0e660df4f9cc40fa53f34e218ece8"} Feb 17 00:30:09 crc kubenswrapper[4791]: I0217 00:30:09.373428 4791 generic.go:334] "Generic (PLEG): container finished" podID="8399bca3-cb20-4884-9b21-8ec3dae4c326" containerID="6dc8ea3daa418770ab1f163521fedb78769049fea4d06277ba9a21c58b357e21" exitCode=0 Feb 17 00:30:09 crc kubenswrapper[4791]: I0217 00:30:09.374220 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" 
event={"ID":"8399bca3-cb20-4884-9b21-8ec3dae4c326","Type":"ContainerDied","Data":"6dc8ea3daa418770ab1f163521fedb78769049fea4d06277ba9a21c58b357e21"} Feb 17 00:30:09 crc kubenswrapper[4791]: I0217 00:30:09.376760 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-k8frg" event={"ID":"e8590bb8-aabf-4bf2-b28d-f1c7b0872780","Type":"ContainerStarted","Data":"d13f5fd2822d8c39c1a9232da45b4fcc7917aa08355fdc5dc30d6280c800f247"} Feb 17 00:30:09 crc kubenswrapper[4791]: I0217 00:30:09.429505 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-k8frg" podStartSLOduration=2.304329313 podStartE2EDuration="22.429486391s" podCreationTimestamp="2026-02-17 00:29:47 +0000 UTC" firstStartedPulling="2026-02-17 00:29:48.110687048 +0000 UTC m=+1445.590199575" lastFinishedPulling="2026-02-17 00:30:08.235844126 +0000 UTC m=+1465.715356653" observedRunningTime="2026-02-17 00:30:09.418524451 +0000 UTC m=+1466.898036978" watchObservedRunningTime="2026-02-17 00:30:09.429486391 +0000 UTC m=+1466.908998908" Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.687345 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.768190 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8399bca3-cb20-4884-9b21-8ec3dae4c326-secret-volume\") pod \"8399bca3-cb20-4884-9b21-8ec3dae4c326\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.768297 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjpgh\" (UniqueName: \"kubernetes.io/projected/8399bca3-cb20-4884-9b21-8ec3dae4c326-kube-api-access-fjpgh\") pod \"8399bca3-cb20-4884-9b21-8ec3dae4c326\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.768424 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8399bca3-cb20-4884-9b21-8ec3dae4c326-config-volume\") pod \"8399bca3-cb20-4884-9b21-8ec3dae4c326\" (UID: \"8399bca3-cb20-4884-9b21-8ec3dae4c326\") " Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.769145 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8399bca3-cb20-4884-9b21-8ec3dae4c326-config-volume" (OuterVolumeSpecName: "config-volume") pod "8399bca3-cb20-4884-9b21-8ec3dae4c326" (UID: "8399bca3-cb20-4884-9b21-8ec3dae4c326"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.773371 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8399bca3-cb20-4884-9b21-8ec3dae4c326-kube-api-access-fjpgh" (OuterVolumeSpecName: "kube-api-access-fjpgh") pod "8399bca3-cb20-4884-9b21-8ec3dae4c326" (UID: "8399bca3-cb20-4884-9b21-8ec3dae4c326"). 
InnerVolumeSpecName "kube-api-access-fjpgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.776235 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8399bca3-cb20-4884-9b21-8ec3dae4c326-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8399bca3-cb20-4884-9b21-8ec3dae4c326" (UID: "8399bca3-cb20-4884-9b21-8ec3dae4c326"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.870259 4791 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8399bca3-cb20-4884-9b21-8ec3dae4c326-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.870320 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjpgh\" (UniqueName: \"kubernetes.io/projected/8399bca3-cb20-4884-9b21-8ec3dae4c326-kube-api-access-fjpgh\") on node \"crc\" DevicePath \"\"" Feb 17 00:30:10 crc kubenswrapper[4791]: I0217 00:30:10.870340 4791 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8399bca3-cb20-4884-9b21-8ec3dae4c326-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 00:30:11 crc kubenswrapper[4791]: I0217 00:30:11.396840 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" event={"ID":"8399bca3-cb20-4884-9b21-8ec3dae4c326","Type":"ContainerDied","Data":"8b3cc8284e64a2f5812158dddd5a1b23b6c0e660df4f9cc40fa53f34e218ece8"} Feb 17 00:30:11 crc kubenswrapper[4791]: I0217 00:30:11.396882 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b3cc8284e64a2f5812158dddd5a1b23b6c0e660df4f9cc40fa53f34e218ece8" Feb 17 00:30:11 crc kubenswrapper[4791]: I0217 00:30:11.396932 4791 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521470-l9cv9" Feb 17 00:30:24 crc kubenswrapper[4791]: I0217 00:30:24.973632 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 00:30:24 crc kubenswrapper[4791]: I0217 00:30:24.974534 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 00:30:24 crc kubenswrapper[4791]: I0217 00:30:24.974588 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" Feb 17 00:30:24 crc kubenswrapper[4791]: I0217 00:30:24.975324 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0ab0ea25cf5596643285f502958b16b41d67465e6ae2b0bd26ff7c3c0dfd7a57"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 00:30:24 crc kubenswrapper[4791]: I0217 00:30:24.975394 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://0ab0ea25cf5596643285f502958b16b41d67465e6ae2b0bd26ff7c3c0dfd7a57" gracePeriod=600 Feb 17 00:30:24 crc kubenswrapper[4791]: I0217 00:30:24.984856 4791 log.go:25] "Finished 
parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-z8m2z_135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd/prometheus-webhook-snmp/0.log" Feb 17 00:30:25 crc kubenswrapper[4791]: I0217 00:30:25.515703 4791 generic.go:334] "Generic (PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="0ab0ea25cf5596643285f502958b16b41d67465e6ae2b0bd26ff7c3c0dfd7a57" exitCode=0 Feb 17 00:30:25 crc kubenswrapper[4791]: I0217 00:30:25.515841 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"0ab0ea25cf5596643285f502958b16b41d67465e6ae2b0bd26ff7c3c0dfd7a57"} Feb 17 00:30:25 crc kubenswrapper[4791]: I0217 00:30:25.515868 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerStarted","Data":"83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"} Feb 17 00:30:25 crc kubenswrapper[4791]: I0217 00:30:25.515883 4791 scope.go:117] "RemoveContainer" containerID="71eb8be49ce7a773ea08603b5bc6a4c5e5daf5e96a1784fa4c29b0f51c425ed1" Feb 17 00:30:32 crc kubenswrapper[4791]: I0217 00:30:32.575235 4791 generic.go:334] "Generic (PLEG): container finished" podID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerID="92e80ec48707a383e8e983cd6a7e22c9b6a5c7d87140df423504e893b8aba7ed" exitCode=0 Feb 17 00:30:32 crc kubenswrapper[4791]: I0217 00:30:32.575342 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-k8frg" event={"ID":"e8590bb8-aabf-4bf2-b28d-f1c7b0872780","Type":"ContainerDied","Data":"92e80ec48707a383e8e983cd6a7e22c9b6a5c7d87140df423504e893b8aba7ed"} Feb 17 00:30:32 crc kubenswrapper[4791]: I0217 00:30:32.576262 4791 scope.go:117] "RemoveContainer" 
containerID="92e80ec48707a383e8e983cd6a7e22c9b6a5c7d87140df423504e893b8aba7ed" Feb 17 00:30:40 crc kubenswrapper[4791]: I0217 00:30:40.649994 4791 generic.go:334] "Generic (PLEG): container finished" podID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerID="d13f5fd2822d8c39c1a9232da45b4fcc7917aa08355fdc5dc30d6280c800f247" exitCode=0 Feb 17 00:30:40 crc kubenswrapper[4791]: I0217 00:30:40.650541 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-k8frg" event={"ID":"e8590bb8-aabf-4bf2-b28d-f1c7b0872780","Type":"ContainerDied","Data":"d13f5fd2822d8c39c1a9232da45b4fcc7917aa08355fdc5dc30d6280c800f247"} Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.919044 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.969463 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-entrypoint-script\") pod \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.969526 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-healthcheck-log\") pod \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.969559 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-sensubility-config\") pod \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 
00:30:41.969617 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-entrypoint-script\") pod \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.969648 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsx9p\" (UniqueName: \"kubernetes.io/projected/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-kube-api-access-lsx9p\") pod \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.969707 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-config\") pod \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.969764 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-publisher\") pod \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\" (UID: \"e8590bb8-aabf-4bf2-b28d-f1c7b0872780\") " Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.975462 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-kube-api-access-lsx9p" (OuterVolumeSpecName: "kube-api-access-lsx9p") pod "e8590bb8-aabf-4bf2-b28d-f1c7b0872780" (UID: "e8590bb8-aabf-4bf2-b28d-f1c7b0872780"). InnerVolumeSpecName "kube-api-access-lsx9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.985816 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "e8590bb8-aabf-4bf2-b28d-f1c7b0872780" (UID: "e8590bb8-aabf-4bf2-b28d-f1c7b0872780"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.986683 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "e8590bb8-aabf-4bf2-b28d-f1c7b0872780" (UID: "e8590bb8-aabf-4bf2-b28d-f1c7b0872780"). InnerVolumeSpecName "ceilometer-publisher". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.988703 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "e8590bb8-aabf-4bf2-b28d-f1c7b0872780" (UID: "e8590bb8-aabf-4bf2-b28d-f1c7b0872780"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.990181 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "e8590bb8-aabf-4bf2-b28d-f1c7b0872780" (UID: "e8590bb8-aabf-4bf2-b28d-f1c7b0872780"). InnerVolumeSpecName "sensubility-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.992974 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "e8590bb8-aabf-4bf2-b28d-f1c7b0872780" (UID: "e8590bb8-aabf-4bf2-b28d-f1c7b0872780"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:30:41 crc kubenswrapper[4791]: I0217 00:30:41.999267 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "e8590bb8-aabf-4bf2-b28d-f1c7b0872780" (UID: "e8590bb8-aabf-4bf2-b28d-f1c7b0872780"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.071958 4791 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.072012 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsx9p\" (UniqueName: \"kubernetes.io/projected/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-kube-api-access-lsx9p\") on node \"crc\" DevicePath \"\"" Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.072030 4791 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.072042 4791 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: 
\"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.072054 4791 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.072066 4791 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-healthcheck-log\") on node \"crc\" DevicePath \"\"" Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.072078 4791 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e8590bb8-aabf-4bf2-b28d-f1c7b0872780-sensubility-config\") on node \"crc\" DevicePath \"\"" Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.672736 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-k8frg" event={"ID":"e8590bb8-aabf-4bf2-b28d-f1c7b0872780","Type":"ContainerDied","Data":"fb78639120c2ceb67338064e7f2d7aba445e0c94d0f5aa427da45e085399f0fb"} Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.672995 4791 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb78639120c2ceb67338064e7f2d7aba445e0c94d0f5aa427da45e085399f0fb" Feb 17 00:30:42 crc kubenswrapper[4791]: I0217 00:30:42.672875 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-k8frg" Feb 17 00:30:44 crc kubenswrapper[4791]: I0217 00:30:44.078016 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-k8frg_e8590bb8-aabf-4bf2-b28d-f1c7b0872780/smoketest-collectd/0.log" Feb 17 00:30:44 crc kubenswrapper[4791]: I0217 00:30:44.353030 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-k8frg_e8590bb8-aabf-4bf2-b28d-f1c7b0872780/smoketest-ceilometer/0.log" Feb 17 00:30:44 crc kubenswrapper[4791]: I0217 00:30:44.656941 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-sjw5t_9caafe24-6ee4-425b-b175-c0901dab223f/default-interconnect/0.log" Feb 17 00:30:44 crc kubenswrapper[4791]: I0217 00:30:44.919285 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8_0d7ac416-17e0-4f86-8786-0afdec7fc240/bridge/2.log" Feb 17 00:30:45 crc kubenswrapper[4791]: I0217 00:30:45.204056 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-2pnc8_0d7ac416-17e0-4f86-8786-0afdec7fc240/sg-core/0.log" Feb 17 00:30:45 crc kubenswrapper[4791]: I0217 00:30:45.545555 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk_4634db00-8e3c-4569-b66c-ef549eda9204/bridge/2.log" Feb 17 00:30:45 crc kubenswrapper[4791]: I0217 00:30:45.857303 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-598fd557d8-m4mpk_4634db00-8e3c-4569-b66c-ef549eda9204/sg-core/0.log" Feb 17 00:30:46 crc kubenswrapper[4791]: I0217 00:30:46.170564 4791 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9_e3d9a725-2f9c-4fcc-8610-4b297a3d689d/bridge/2.log" Feb 17 00:30:46 crc kubenswrapper[4791]: I0217 00:30:46.466278 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-qcmn9_e3d9a725-2f9c-4fcc-8610-4b297a3d689d/sg-core/0.log" Feb 17 00:30:46 crc kubenswrapper[4791]: I0217 00:30:46.790200 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m_21f15fa5-1f89-4aae-b6df-6a7c33630f43/bridge/2.log" Feb 17 00:30:47 crc kubenswrapper[4791]: I0217 00:30:47.041482 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-54bb4c7f8f-m4t7m_21f15fa5-1f89-4aae-b6df-6a7c33630f43/sg-core/0.log" Feb 17 00:30:47 crc kubenswrapper[4791]: I0217 00:30:47.294913 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p_6e985bca-5f71-47e0-bd63-ede2ad79bd7e/bridge/2.log" Feb 17 00:30:47 crc kubenswrapper[4791]: I0217 00:30:47.571447 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-rpx6p_6e985bca-5f71-47e0-bd63-ede2ad79bd7e/sg-core/0.log" Feb 17 00:30:51 crc kubenswrapper[4791]: I0217 00:30:51.573924 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-6f787cb998-k5dw6_29c809ce-6a9b-4496-9c8e-8cd4506d926b/operator/0.log" Feb 17 00:30:51 crc kubenswrapper[4791]: I0217 00:30:51.920427 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_af5ac410-05c3-4bf2-b2fa-fd7abd1d51a9/prometheus/0.log" Feb 17 00:30:52 crc kubenswrapper[4791]: I0217 00:30:52.200767 4791 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_335ade17-e7c1-487c-9e12-ad3d0d3610b0/elasticsearch/0.log" Feb 17 00:30:52 crc kubenswrapper[4791]: I0217 00:30:52.538150 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-z8m2z_135a7c9c-cdfa-4baa-a4b3-ea9f6392d1cd/prometheus-webhook-snmp/0.log" Feb 17 00:30:52 crc kubenswrapper[4791]: I0217 00:30:52.864035 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_df247d19-621c-4c9b-a436-d4f263dcb5ae/alertmanager/0.log" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.201527 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kj28p"] Feb 17 00:30:58 crc kubenswrapper[4791]: E0217 00:30:58.202368 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerName="smoketest-collectd" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.202385 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerName="smoketest-collectd" Feb 17 00:30:58 crc kubenswrapper[4791]: E0217 00:30:58.202406 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8399bca3-cb20-4884-9b21-8ec3dae4c326" containerName="collect-profiles" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.202414 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="8399bca3-cb20-4884-9b21-8ec3dae4c326" containerName="collect-profiles" Feb 17 00:30:58 crc kubenswrapper[4791]: E0217 00:30:58.202434 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerName="smoketest-ceilometer" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.202443 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerName="smoketest-ceilometer" Feb 17 00:30:58 crc 
kubenswrapper[4791]: I0217 00:30:58.202576 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerName="smoketest-ceilometer" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.202590 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8590bb8-aabf-4bf2-b28d-f1c7b0872780" containerName="smoketest-collectd" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.202602 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="8399bca3-cb20-4884-9b21-8ec3dae4c326" containerName="collect-profiles" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.203676 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.218511 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kj28p"] Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.318847 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-utilities\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.319003 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztf7b\" (UniqueName: \"kubernetes.io/projected/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-kube-api-access-ztf7b\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.319041 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-catalog-content\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.420100 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-utilities\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.420213 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztf7b\" (UniqueName: \"kubernetes.io/projected/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-kube-api-access-ztf7b\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.420242 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-catalog-content\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.420790 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-catalog-content\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.421068 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-utilities\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.445574 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztf7b\" (UniqueName: \"kubernetes.io/projected/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-kube-api-access-ztf7b\") pod \"community-operators-kj28p\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:30:58 crc kubenswrapper[4791]: I0217 00:30:58.519834 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:30:59 crc kubenswrapper[4791]: I0217 00:30:59.015939 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kj28p"] Feb 17 00:30:59 crc kubenswrapper[4791]: I0217 00:30:59.810630 4791 generic.go:334] "Generic (PLEG): container finished" podID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerID="a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5" exitCode=0 Feb 17 00:30:59 crc kubenswrapper[4791]: I0217 00:30:59.810742 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj28p" event={"ID":"b014ddb5-2da5-47ba-85bd-4e93e8f670c4","Type":"ContainerDied","Data":"a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5"} Feb 17 00:30:59 crc kubenswrapper[4791]: I0217 00:30:59.810919 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj28p" event={"ID":"b014ddb5-2da5-47ba-85bd-4e93e8f670c4","Type":"ContainerStarted","Data":"71a6b3b5f0cefd2907aba0a8a16ba8ee674dbdf8214bb14dcede67bd40deffee"} Feb 17 00:31:00 crc kubenswrapper[4791]: I0217 00:31:00.830327 4791 generic.go:334] "Generic (PLEG): container 
finished" podID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerID="dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8" exitCode=0 Feb 17 00:31:00 crc kubenswrapper[4791]: I0217 00:31:00.830385 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj28p" event={"ID":"b014ddb5-2da5-47ba-85bd-4e93e8f670c4","Type":"ContainerDied","Data":"dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8"} Feb 17 00:31:01 crc kubenswrapper[4791]: I0217 00:31:01.839814 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj28p" event={"ID":"b014ddb5-2da5-47ba-85bd-4e93e8f670c4","Type":"ContainerStarted","Data":"5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794"} Feb 17 00:31:01 crc kubenswrapper[4791]: I0217 00:31:01.859713 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kj28p" podStartSLOduration=2.462567457 podStartE2EDuration="3.859695876s" podCreationTimestamp="2026-02-17 00:30:58 +0000 UTC" firstStartedPulling="2026-02-17 00:30:59.813150083 +0000 UTC m=+1517.292662620" lastFinishedPulling="2026-02-17 00:31:01.210278512 +0000 UTC m=+1518.689791039" observedRunningTime="2026-02-17 00:31:01.855431393 +0000 UTC m=+1519.334943920" watchObservedRunningTime="2026-02-17 00:31:01.859695876 +0000 UTC m=+1519.339208403" Feb 17 00:31:07 crc kubenswrapper[4791]: I0217 00:31:07.971048 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-6f88b4fbc-h6xn7_c8e979be-fe5a-4d89-b1a6-0260fffdd27c/operator/0.log" Feb 17 00:31:08 crc kubenswrapper[4791]: I0217 00:31:08.520024 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:31:08 crc kubenswrapper[4791]: I0217 00:31:08.520097 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:31:08 crc kubenswrapper[4791]: I0217 00:31:08.596198 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:31:08 crc kubenswrapper[4791]: I0217 00:31:08.938332 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:31:08 crc kubenswrapper[4791]: I0217 00:31:08.989373 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kj28p"] Feb 17 00:31:10 crc kubenswrapper[4791]: I0217 00:31:10.918016 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kj28p" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="registry-server" containerID="cri-o://5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794" gracePeriod=2 Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.317590 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.409780 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-catalog-content\") pod \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.409878 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-utilities\") pod \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.409956 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztf7b\" (UniqueName: \"kubernetes.io/projected/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-kube-api-access-ztf7b\") pod \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\" (UID: \"b014ddb5-2da5-47ba-85bd-4e93e8f670c4\") " Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.410853 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-utilities" (OuterVolumeSpecName: "utilities") pod "b014ddb5-2da5-47ba-85bd-4e93e8f670c4" (UID: "b014ddb5-2da5-47ba-85bd-4e93e8f670c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.417249 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-kube-api-access-ztf7b" (OuterVolumeSpecName: "kube-api-access-ztf7b") pod "b014ddb5-2da5-47ba-85bd-4e93e8f670c4" (UID: "b014ddb5-2da5-47ba-85bd-4e93e8f670c4"). InnerVolumeSpecName "kube-api-access-ztf7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.489616 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b014ddb5-2da5-47ba-85bd-4e93e8f670c4" (UID: "b014ddb5-2da5-47ba-85bd-4e93e8f670c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.511655 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.511719 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztf7b\" (UniqueName: \"kubernetes.io/projected/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-kube-api-access-ztf7b\") on node \"crc\" DevicePath \"\"" Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.511745 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b014ddb5-2da5-47ba-85bd-4e93e8f670c4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.926361 4791 generic.go:334] "Generic (PLEG): container finished" podID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerID="5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794" exitCode=0 Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.926451 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj28p" event={"ID":"b014ddb5-2da5-47ba-85bd-4e93e8f670c4","Type":"ContainerDied","Data":"5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794"} Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.926491 4791 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-kj28p" event={"ID":"b014ddb5-2da5-47ba-85bd-4e93e8f670c4","Type":"ContainerDied","Data":"71a6b3b5f0cefd2907aba0a8a16ba8ee674dbdf8214bb14dcede67bd40deffee"} Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.926520 4791 scope.go:117] "RemoveContainer" containerID="5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794" Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.926564 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kj28p" Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.947537 4791 scope.go:117] "RemoveContainer" containerID="dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8" Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.962938 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kj28p"] Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.983401 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kj28p"] Feb 17 00:31:11 crc kubenswrapper[4791]: I0217 00:31:11.989631 4791 scope.go:117] "RemoveContainer" containerID="a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5" Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 00:31:12.003140 4791 scope.go:117] "RemoveContainer" containerID="5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794" Feb 17 00:31:12 crc kubenswrapper[4791]: E0217 00:31:12.003577 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794\": container with ID starting with 5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794 not found: ID does not exist" containerID="5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794" Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 
00:31:12.003608 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794"} err="failed to get container status \"5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794\": rpc error: code = NotFound desc = could not find container \"5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794\": container with ID starting with 5fce4957f3803d0bc504e549f47003789886d2c421d6af3b0117de275e094794 not found: ID does not exist" Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 00:31:12.003628 4791 scope.go:117] "RemoveContainer" containerID="dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8" Feb 17 00:31:12 crc kubenswrapper[4791]: E0217 00:31:12.003932 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8\": container with ID starting with dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8 not found: ID does not exist" containerID="dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8" Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 00:31:12.003980 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8"} err="failed to get container status \"dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8\": rpc error: code = NotFound desc = could not find container \"dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8\": container with ID starting with dec6d0ebc8a287558eb364bf2a9baeac2db2dbbbd7d079fe46f1a5f38f09b7d8 not found: ID does not exist" Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 00:31:12.004012 4791 scope.go:117] "RemoveContainer" containerID="a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5" Feb 17 00:31:12 crc 
kubenswrapper[4791]: E0217 00:31:12.004664 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5\": container with ID starting with a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5 not found: ID does not exist" containerID="a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5" Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 00:31:12.004757 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5"} err="failed to get container status \"a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5\": rpc error: code = NotFound desc = could not find container \"a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5\": container with ID starting with a6e367ac52fd977a5b414318cc5fa96f46c1094fadde60a04794646e24a14aa5 not found: ID does not exist" Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 00:31:12.088124 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-6f787cb998-k5dw6_29c809ce-6a9b-4496-9c8e-8cd4506d926b/operator/0.log" Feb 17 00:31:12 crc kubenswrapper[4791]: I0217 00:31:12.364135 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_43948ca6-1e04-4a7f-867d-d5f6d69d240d/qdr/0.log" Feb 17 00:31:13 crc kubenswrapper[4791]: I0217 00:31:13.234378 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" path="/var/lib/kubelet/pods/b014ddb5-2da5-47ba-85bd-4e93e8f670c4/volumes" Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.928584 4791 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-glgdj/must-gather-f94nt"] Feb 17 00:31:36 crc kubenswrapper[4791]: E0217 00:31:36.930672 4791 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="extract-utilities" Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.930775 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="extract-utilities" Feb 17 00:31:36 crc kubenswrapper[4791]: E0217 00:31:36.930855 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="extract-content" Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.930930 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="extract-content" Feb 17 00:31:36 crc kubenswrapper[4791]: E0217 00:31:36.931007 4791 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="registry-server" Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.931136 4791 state_mem.go:107] "Deleted CPUSet assignment" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="registry-server" Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.931383 4791 memory_manager.go:354] "RemoveStaleState removing state" podUID="b014ddb5-2da5-47ba-85bd-4e93e8f670c4" containerName="registry-server" Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.932535 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-glgdj/must-gather-f94nt" Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.939707 4791 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-glgdj"/"default-dockercfg-54jg6" Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.942053 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-glgdj"/"kube-root-ca.crt" Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.942256 4791 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-glgdj"/"openshift-service-ca.crt" Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.958505 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-glgdj/must-gather-f94nt"] Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.995209 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4x67\" (UniqueName: \"kubernetes.io/projected/240269b5-7b03-4e43-8d40-106c95b85777-kube-api-access-p4x67\") pod \"must-gather-f94nt\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") " pod="openshift-must-gather-glgdj/must-gather-f94nt" Feb 17 00:31:36 crc kubenswrapper[4791]: I0217 00:31:36.995289 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240269b5-7b03-4e43-8d40-106c95b85777-must-gather-output\") pod \"must-gather-f94nt\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") " pod="openshift-must-gather-glgdj/must-gather-f94nt" Feb 17 00:31:37 crc kubenswrapper[4791]: I0217 00:31:37.096972 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240269b5-7b03-4e43-8d40-106c95b85777-must-gather-output\") pod \"must-gather-f94nt\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") " 
pod="openshift-must-gather-glgdj/must-gather-f94nt" Feb 17 00:31:37 crc kubenswrapper[4791]: I0217 00:31:37.097377 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4x67\" (UniqueName: \"kubernetes.io/projected/240269b5-7b03-4e43-8d40-106c95b85777-kube-api-access-p4x67\") pod \"must-gather-f94nt\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") " pod="openshift-must-gather-glgdj/must-gather-f94nt" Feb 17 00:31:37 crc kubenswrapper[4791]: I0217 00:31:37.098073 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240269b5-7b03-4e43-8d40-106c95b85777-must-gather-output\") pod \"must-gather-f94nt\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") " pod="openshift-must-gather-glgdj/must-gather-f94nt" Feb 17 00:31:37 crc kubenswrapper[4791]: I0217 00:31:37.124095 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4x67\" (UniqueName: \"kubernetes.io/projected/240269b5-7b03-4e43-8d40-106c95b85777-kube-api-access-p4x67\") pod \"must-gather-f94nt\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") " pod="openshift-must-gather-glgdj/must-gather-f94nt" Feb 17 00:31:37 crc kubenswrapper[4791]: I0217 00:31:37.252892 4791 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-glgdj/must-gather-f94nt" Feb 17 00:31:37 crc kubenswrapper[4791]: I0217 00:31:37.491371 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-glgdj/must-gather-f94nt"] Feb 17 00:31:38 crc kubenswrapper[4791]: I0217 00:31:38.132025 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgdj/must-gather-f94nt" event={"ID":"240269b5-7b03-4e43-8d40-106c95b85777","Type":"ContainerStarted","Data":"e90a6164f4d9cc20a6c3e3e71a51caefc00efd669525e95904943bf687926582"} Feb 17 00:31:46 crc kubenswrapper[4791]: I0217 00:31:46.194081 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgdj/must-gather-f94nt" event={"ID":"240269b5-7b03-4e43-8d40-106c95b85777","Type":"ContainerStarted","Data":"7e570ede014f3b62e7289481867c7b647495eebf2f846f5e59e0597dd23ce004"} Feb 17 00:31:46 crc kubenswrapper[4791]: I0217 00:31:46.194961 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgdj/must-gather-f94nt" event={"ID":"240269b5-7b03-4e43-8d40-106c95b85777","Type":"ContainerStarted","Data":"4aacef8b7dbadb146ac970bd0266869527a6bdc9bd8084d697396cb1fdecc57d"} Feb 17 00:31:46 crc kubenswrapper[4791]: I0217 00:31:46.220268 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-glgdj/must-gather-f94nt" podStartSLOduration=2.380193307 podStartE2EDuration="10.220246989s" podCreationTimestamp="2026-02-17 00:31:36 +0000 UTC" firstStartedPulling="2026-02-17 00:31:37.51394872 +0000 UTC m=+1554.993461247" lastFinishedPulling="2026-02-17 00:31:45.354002402 +0000 UTC m=+1562.833514929" observedRunningTime="2026-02-17 00:31:46.211849767 +0000 UTC m=+1563.691362304" watchObservedRunningTime="2026-02-17 00:31:46.220246989 +0000 UTC m=+1563.699759536" Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.301299 4791 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-h7fp9"] Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.303462 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.314442 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h7fp9"] Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.394648 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2mbr\" (UniqueName: \"kubernetes.io/projected/ec42b5a3-c15d-4fef-9370-86ae9da61992-kube-api-access-g2mbr\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.394978 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-catalog-content\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.395023 4791 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-utilities\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.495841 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2mbr\" (UniqueName: \"kubernetes.io/projected/ec42b5a3-c15d-4fef-9370-86ae9da61992-kube-api-access-g2mbr\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " 
pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.495891 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-catalog-content\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.495931 4791 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-utilities\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.496282 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-catalog-content\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.496344 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-utilities\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc kubenswrapper[4791]: I0217 00:32:21.523944 4791 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2mbr\" (UniqueName: \"kubernetes.io/projected/ec42b5a3-c15d-4fef-9370-86ae9da61992-kube-api-access-g2mbr\") pod \"redhat-operators-h7fp9\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:21 crc 
kubenswrapper[4791]: I0217 00:32:21.662756 4791 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:22 crc kubenswrapper[4791]: I0217 00:32:22.101538 4791 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h7fp9"] Feb 17 00:32:22 crc kubenswrapper[4791]: I0217 00:32:22.485319 4791 generic.go:334] "Generic (PLEG): container finished" podID="ec42b5a3-c15d-4fef-9370-86ae9da61992" containerID="5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600" exitCode=0 Feb 17 00:32:22 crc kubenswrapper[4791]: I0217 00:32:22.485362 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7fp9" event={"ID":"ec42b5a3-c15d-4fef-9370-86ae9da61992","Type":"ContainerDied","Data":"5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600"} Feb 17 00:32:22 crc kubenswrapper[4791]: I0217 00:32:22.485389 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7fp9" event={"ID":"ec42b5a3-c15d-4fef-9370-86ae9da61992","Type":"ContainerStarted","Data":"e3c1ae6008c18b4a67644ca33fef834fc97a5ba76e470d56247db28c4077af92"} Feb 17 00:32:24 crc kubenswrapper[4791]: I0217 00:32:24.501818 4791 generic.go:334] "Generic (PLEG): container finished" podID="ec42b5a3-c15d-4fef-9370-86ae9da61992" containerID="2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a" exitCode=0 Feb 17 00:32:24 crc kubenswrapper[4791]: I0217 00:32:24.501886 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7fp9" event={"ID":"ec42b5a3-c15d-4fef-9370-86ae9da61992","Type":"ContainerDied","Data":"2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a"} Feb 17 00:32:25 crc kubenswrapper[4791]: I0217 00:32:25.511243 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7fp9" 
event={"ID":"ec42b5a3-c15d-4fef-9370-86ae9da61992","Type":"ContainerStarted","Data":"71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d"} Feb 17 00:32:25 crc kubenswrapper[4791]: I0217 00:32:25.532098 4791 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h7fp9" podStartSLOduration=1.89400008 podStartE2EDuration="4.532077065s" podCreationTimestamp="2026-02-17 00:32:21 +0000 UTC" firstStartedPulling="2026-02-17 00:32:22.48653079 +0000 UTC m=+1599.966043307" lastFinishedPulling="2026-02-17 00:32:25.124607725 +0000 UTC m=+1602.604120292" observedRunningTime="2026-02-17 00:32:25.527407949 +0000 UTC m=+1603.006920476" watchObservedRunningTime="2026-02-17 00:32:25.532077065 +0000 UTC m=+1603.011589602" Feb 17 00:32:31 crc kubenswrapper[4791]: I0217 00:32:31.277518 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-kqr99_03d7a8df-a8a3-4b34-bd28-d554ae70875a/control-plane-machine-set-operator/0.log" Feb 17 00:32:31 crc kubenswrapper[4791]: I0217 00:32:31.445966 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5nwz7_9c752f56-7754-4718-aea5-cb41d6ac4253/machine-api-operator/0.log" Feb 17 00:32:31 crc kubenswrapper[4791]: I0217 00:32:31.454961 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5nwz7_9c752f56-7754-4718-aea5-cb41d6ac4253/kube-rbac-proxy/0.log" Feb 17 00:32:31 crc kubenswrapper[4791]: I0217 00:32:31.663567 4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:31 crc kubenswrapper[4791]: I0217 00:32:31.663622 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:31 crc kubenswrapper[4791]: I0217 00:32:31.705798 
4791 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:32 crc kubenswrapper[4791]: I0217 00:32:32.622826 4791 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:32 crc kubenswrapper[4791]: I0217 00:32:32.670188 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h7fp9"] Feb 17 00:32:34 crc kubenswrapper[4791]: I0217 00:32:34.585821 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h7fp9" podUID="ec42b5a3-c15d-4fef-9370-86ae9da61992" containerName="registry-server" containerID="cri-o://71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d" gracePeriod=2 Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.008943 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.128035 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2mbr\" (UniqueName: \"kubernetes.io/projected/ec42b5a3-c15d-4fef-9370-86ae9da61992-kube-api-access-g2mbr\") pod \"ec42b5a3-c15d-4fef-9370-86ae9da61992\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.128195 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-utilities\") pod \"ec42b5a3-c15d-4fef-9370-86ae9da61992\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.128320 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-catalog-content\") pod \"ec42b5a3-c15d-4fef-9370-86ae9da61992\" (UID: \"ec42b5a3-c15d-4fef-9370-86ae9da61992\") " Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.129474 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-utilities" (OuterVolumeSpecName: "utilities") pod "ec42b5a3-c15d-4fef-9370-86ae9da61992" (UID: "ec42b5a3-c15d-4fef-9370-86ae9da61992"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.138102 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec42b5a3-c15d-4fef-9370-86ae9da61992-kube-api-access-g2mbr" (OuterVolumeSpecName: "kube-api-access-g2mbr") pod "ec42b5a3-c15d-4fef-9370-86ae9da61992" (UID: "ec42b5a3-c15d-4fef-9370-86ae9da61992"). InnerVolumeSpecName "kube-api-access-g2mbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.229795 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2mbr\" (UniqueName: \"kubernetes.io/projected/ec42b5a3-c15d-4fef-9370-86ae9da61992-kube-api-access-g2mbr\") on node \"crc\" DevicePath \"\"" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.229842 4791 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.258898 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec42b5a3-c15d-4fef-9370-86ae9da61992" (UID: "ec42b5a3-c15d-4fef-9370-86ae9da61992"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.330744 4791 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec42b5a3-c15d-4fef-9370-86ae9da61992-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.594301 4791 generic.go:334] "Generic (PLEG): container finished" podID="ec42b5a3-c15d-4fef-9370-86ae9da61992" containerID="71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d" exitCode=0 Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.594378 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7fp9" event={"ID":"ec42b5a3-c15d-4fef-9370-86ae9da61992","Type":"ContainerDied","Data":"71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d"} Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.594417 4791 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h7fp9" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.594442 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7fp9" event={"ID":"ec42b5a3-c15d-4fef-9370-86ae9da61992","Type":"ContainerDied","Data":"e3c1ae6008c18b4a67644ca33fef834fc97a5ba76e470d56247db28c4077af92"} Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.594483 4791 scope.go:117] "RemoveContainer" containerID="71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.612822 4791 scope.go:117] "RemoveContainer" containerID="2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.644793 4791 scope.go:117] "RemoveContainer" containerID="5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.645784 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h7fp9"] Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.654303 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h7fp9"] Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.677944 4791 scope.go:117] "RemoveContainer" containerID="71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d" Feb 17 00:32:35 crc kubenswrapper[4791]: E0217 00:32:35.678368 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d\": container with ID starting with 71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d not found: ID does not exist" containerID="71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.678417 4791 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d"} err="failed to get container status \"71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d\": rpc error: code = NotFound desc = could not find container \"71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d\": container with ID starting with 71a5a4440a95f3e5edb3331ff9c962fb10246ca6a4d78e993445f1467450d51d not found: ID does not exist" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.678453 4791 scope.go:117] "RemoveContainer" containerID="2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a" Feb 17 00:32:35 crc kubenswrapper[4791]: E0217 00:32:35.678793 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a\": container with ID starting with 2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a not found: ID does not exist" containerID="2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.678829 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a"} err="failed to get container status \"2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a\": rpc error: code = NotFound desc = could not find container \"2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a\": container with ID starting with 2e70432ea866312b83604e2c0311fc971f1a540d9eda873767043a0d54469a5a not found: ID does not exist" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.678849 4791 scope.go:117] "RemoveContainer" containerID="5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600" Feb 17 00:32:35 crc kubenswrapper[4791]: E0217 
00:32:35.679145 4791 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600\": container with ID starting with 5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600 not found: ID does not exist" containerID="5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600" Feb 17 00:32:35 crc kubenswrapper[4791]: I0217 00:32:35.679192 4791 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600"} err="failed to get container status \"5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600\": rpc error: code = NotFound desc = could not find container \"5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600\": container with ID starting with 5ba0aaf55242c1e2fee9e95b6f19940cedef58731e631cb722505a8df02f3600 not found: ID does not exist" Feb 17 00:32:37 crc kubenswrapper[4791]: I0217 00:32:37.236884 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec42b5a3-c15d-4fef-9370-86ae9da61992" path="/var/lib/kubelet/pods/ec42b5a3-c15d-4fef-9370-86ae9da61992/volumes" Feb 17 00:32:43 crc kubenswrapper[4791]: I0217 00:32:43.944013 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-9dsmn_bf759390-4034-42c9-811b-531aeabd3ed6/cert-manager-controller/0.log" Feb 17 00:32:44 crc kubenswrapper[4791]: I0217 00:32:44.091875 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-l9798_aca3c38d-0dd8-4457-854a-b392ba180087/cert-manager-cainjector/0.log" Feb 17 00:32:44 crc kubenswrapper[4791]: I0217 00:32:44.149561 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-42zhs_4b26c415-6a42-4bda-abbd-cf394bc94043/cert-manager-webhook/0.log" Feb 17 
00:32:54 crc kubenswrapper[4791]: I0217 00:32:54.973062 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 00:32:54 crc kubenswrapper[4791]: I0217 00:32:54.974311 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 00:32:58 crc kubenswrapper[4791]: I0217 00:32:58.328362 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-rw6pj_f73f7b40-6611-465e-ae69-d2f70ce77651/prometheus-operator/0.log"
Feb 17 00:32:58 crc kubenswrapper[4791]: I0217 00:32:58.448570 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f_b046e97f-6343-4e3f-ae0a-0fb40687d992/prometheus-operator-admission-webhook/0.log"
Feb 17 00:32:58 crc kubenswrapper[4791]: I0217 00:32:58.507206 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5_8c43370f-07b8-4f84-b716-34af90be5850/prometheus-operator-admission-webhook/0.log"
Feb 17 00:32:58 crc kubenswrapper[4791]: I0217 00:32:58.636374 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-v2lwp_307585d5-5ed8-43df-b5d8-977729339610/operator/0.log"
Feb 17 00:32:58 crc kubenswrapper[4791]: I0217 00:32:58.686031 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-mk4lp_3b110234-d36d-4ced-a2be-7913bbb84d2a/perses-operator/0.log"
Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.042646 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb_f156dae1-1d4a-47b3-835e-016325f1981c/util/0.log"
Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.187693 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb_f156dae1-1d4a-47b3-835e-016325f1981c/pull/0.log"
Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.197424 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb_f156dae1-1d4a-47b3-835e-016325f1981c/pull/0.log"
Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.208518 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb_f156dae1-1d4a-47b3-835e-016325f1981c/util/0.log"
Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.397332 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb_f156dae1-1d4a-47b3-835e-016325f1981c/pull/0.log"
Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.400784 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb_f156dae1-1d4a-47b3-835e-016325f1981c/util/0.log"
Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.412905 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1ngsjb_f156dae1-1d4a-47b3-835e-016325f1981c/extract/0.log"
Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.567224 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r_bfce146a-61fa-4821-ab43-8fd35dc5fe07/util/0.log"
Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.723811 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r_bfce146a-61fa-4821-ab43-8fd35dc5fe07/pull/0.log"
Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.738392 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r_bfce146a-61fa-4821-ab43-8fd35dc5fe07/util/0.log"
Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.764432 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r_bfce146a-61fa-4821-ab43-8fd35dc5fe07/pull/0.log"
Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.885828 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r_bfce146a-61fa-4821-ab43-8fd35dc5fe07/util/0.log"
Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.907824 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r_bfce146a-61fa-4821-ab43-8fd35dc5fe07/extract/0.log"
Feb 17 00:33:13 crc kubenswrapper[4791]: I0217 00:33:13.933185 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f5gq9r_bfce146a-61fa-4821-ab43-8fd35dc5fe07/pull/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.085960 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm_ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9/util/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.222717 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm_ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9/util/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.255696 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm_ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9/pull/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.280663 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm_ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9/pull/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.442167 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm_ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9/extract/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.450970 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm_ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9/util/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.462351 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56pgjm_ba03d7dd-7e00-4b21-a86b-a2cabeb36ed9/pull/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.601278 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl_306a7321-68e3-4f13-95d0-3c3dbee8b24f/util/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.777991 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl_306a7321-68e3-4f13-95d0-3c3dbee8b24f/pull/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.793657 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl_306a7321-68e3-4f13-95d0-3c3dbee8b24f/util/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.808519 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl_306a7321-68e3-4f13-95d0-3c3dbee8b24f/pull/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.920771 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl_306a7321-68e3-4f13-95d0-3c3dbee8b24f/pull/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.949153 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl_306a7321-68e3-4f13-95d0-3c3dbee8b24f/util/0.log"
Feb 17 00:33:14 crc kubenswrapper[4791]: I0217 00:33:14.964486 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08smjjl_306a7321-68e3-4f13-95d0-3c3dbee8b24f/extract/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.116250 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npbnh_c6f055fb-42f4-4699-8dd3-d93710f92ec8/extract-utilities/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.247066 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npbnh_c6f055fb-42f4-4699-8dd3-d93710f92ec8/extract-utilities/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.285459 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npbnh_c6f055fb-42f4-4699-8dd3-d93710f92ec8/extract-content/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.285497 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npbnh_c6f055fb-42f4-4699-8dd3-d93710f92ec8/extract-content/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.458296 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npbnh_c6f055fb-42f4-4699-8dd3-d93710f92ec8/extract-utilities/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.500360 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npbnh_c6f055fb-42f4-4699-8dd3-d93710f92ec8/extract-content/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.656660 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npbnh_c6f055fb-42f4-4699-8dd3-d93710f92ec8/registry-server/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.674760 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-djrqd_79b4304a-5553-411d-a6df-e2af898a22b0/extract-utilities/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.862369 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-djrqd_79b4304a-5553-411d-a6df-e2af898a22b0/extract-content/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.865815 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-djrqd_79b4304a-5553-411d-a6df-e2af898a22b0/extract-utilities/0.log"
Feb 17 00:33:15 crc kubenswrapper[4791]: I0217 00:33:15.887087 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-djrqd_79b4304a-5553-411d-a6df-e2af898a22b0/extract-content/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.007273 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-djrqd_79b4304a-5553-411d-a6df-e2af898a22b0/extract-content/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.069618 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-djrqd_79b4304a-5553-411d-a6df-e2af898a22b0/extract-utilities/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.218180 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-t8x7k_66e06ad0-6874-4a52-94d8-76da74f7336b/marketplace-operator/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.271279 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-djrqd_79b4304a-5553-411d-a6df-e2af898a22b0/registry-server/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.291280 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sv4n6_f9f068a6-ed4e-4080-a05b-40562b5e8711/extract-utilities/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.455514 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sv4n6_f9f068a6-ed4e-4080-a05b-40562b5e8711/extract-content/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.456198 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sv4n6_f9f068a6-ed4e-4080-a05b-40562b5e8711/extract-content/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.483159 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sv4n6_f9f068a6-ed4e-4080-a05b-40562b5e8711/extract-utilities/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.627026 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sv4n6_f9f068a6-ed4e-4080-a05b-40562b5e8711/extract-utilities/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.630788 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sv4n6_f9f068a6-ed4e-4080-a05b-40562b5e8711/extract-content/0.log"
Feb 17 00:33:16 crc kubenswrapper[4791]: I0217 00:33:16.929530 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sv4n6_f9f068a6-ed4e-4080-a05b-40562b5e8711/registry-server/0.log"
Feb 17 00:33:24 crc kubenswrapper[4791]: I0217 00:33:24.973865 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 00:33:24 crc kubenswrapper[4791]: I0217 00:33:24.974646 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 00:33:29 crc kubenswrapper[4791]: I0217 00:33:29.622583 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b4b5fc6bd-6pl7f_b046e97f-6343-4e3f-ae0a-0fb40687d992/prometheus-operator-admission-webhook/0.log"
Feb 17 00:33:29 crc kubenswrapper[4791]: I0217 00:33:29.655483 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b4b5fc6bd-drxh5_8c43370f-07b8-4f84-b716-34af90be5850/prometheus-operator-admission-webhook/0.log"
Feb 17 00:33:29 crc kubenswrapper[4791]: I0217 00:33:29.658367 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-rw6pj_f73f7b40-6611-465e-ae69-d2f70ce77651/prometheus-operator/0.log"
Feb 17 00:33:29 crc kubenswrapper[4791]: I0217 00:33:29.747596 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-v2lwp_307585d5-5ed8-43df-b5d8-977729339610/operator/0.log"
Feb 17 00:33:29 crc kubenswrapper[4791]: I0217 00:33:29.814407 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-mk4lp_3b110234-d36d-4ced-a2be-7913bbb84d2a/perses-operator/0.log"
Feb 17 00:33:54 crc kubenswrapper[4791]: I0217 00:33:54.972579 4791 patch_prober.go:28] interesting pod/machine-config-daemon-9klkw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 00:33:54 crc kubenswrapper[4791]: I0217 00:33:54.973131 4791 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 00:33:54 crc kubenswrapper[4791]: I0217 00:33:54.973181 4791 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9klkw"
Feb 17 00:33:54 crc kubenswrapper[4791]: I0217 00:33:54.973725 4791 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"} pod="openshift-machine-config-operator/machine-config-daemon-9klkw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 00:33:54 crc kubenswrapper[4791]: I0217 00:33:54.973781 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerName="machine-config-daemon" containerID="cri-o://83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104" gracePeriod=600
Feb 17 00:33:55 crc kubenswrapper[4791]: I0217 00:33:55.264099 4791 generic.go:334] "Generic (PLEG): container finished" podID="02a3a228-86d6-4d54-ad63-0d36c9d59af5" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104" exitCode=0
Feb 17 00:33:55 crc kubenswrapper[4791]: I0217 00:33:55.264152 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" event={"ID":"02a3a228-86d6-4d54-ad63-0d36c9d59af5","Type":"ContainerDied","Data":"83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"}
Feb 17 00:33:55 crc kubenswrapper[4791]: I0217 00:33:55.264579 4791 scope.go:117] "RemoveContainer" containerID="0ab0ea25cf5596643285f502958b16b41d67465e6ae2b0bd26ff7c3c0dfd7a57"
Feb 17 00:33:55 crc kubenswrapper[4791]: E0217 00:33:55.606547 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:33:56 crc kubenswrapper[4791]: I0217 00:33:56.277446 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:33:56 crc kubenswrapper[4791]: E0217 00:33:56.277814 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:34:07 crc kubenswrapper[4791]: I0217 00:34:07.224416 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:34:07 crc kubenswrapper[4791]: E0217 00:34:07.225195 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:34:19 crc kubenswrapper[4791]: I0217 00:34:19.538689 4791 generic.go:334] "Generic (PLEG): container finished" podID="240269b5-7b03-4e43-8d40-106c95b85777" containerID="4aacef8b7dbadb146ac970bd0266869527a6bdc9bd8084d697396cb1fdecc57d" exitCode=0
Feb 17 00:34:19 crc kubenswrapper[4791]: I0217 00:34:19.539281 4791 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgdj/must-gather-f94nt" event={"ID":"240269b5-7b03-4e43-8d40-106c95b85777","Type":"ContainerDied","Data":"4aacef8b7dbadb146ac970bd0266869527a6bdc9bd8084d697396cb1fdecc57d"}
Feb 17 00:34:19 crc kubenswrapper[4791]: I0217 00:34:19.540216 4791 scope.go:117] "RemoveContainer" containerID="4aacef8b7dbadb146ac970bd0266869527a6bdc9bd8084d697396cb1fdecc57d"
Feb 17 00:34:20 crc kubenswrapper[4791]: I0217 00:34:20.220895 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:34:20 crc kubenswrapper[4791]: E0217 00:34:20.221460 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:34:20 crc kubenswrapper[4791]: I0217 00:34:20.344366 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-glgdj_must-gather-f94nt_240269b5-7b03-4e43-8d40-106c95b85777/gather/0.log"
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.468084 4791 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-glgdj/must-gather-f94nt"]
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.470773 4791 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-glgdj/must-gather-f94nt" podUID="240269b5-7b03-4e43-8d40-106c95b85777" containerName="copy" containerID="cri-o://7e570ede014f3b62e7289481867c7b647495eebf2f846f5e59e0597dd23ce004" gracePeriod=2
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.478703 4791 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-glgdj/must-gather-f94nt"]
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.618742 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-glgdj_must-gather-f94nt_240269b5-7b03-4e43-8d40-106c95b85777/copy/0.log"
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.619606 4791 generic.go:334] "Generic (PLEG): container finished" podID="240269b5-7b03-4e43-8d40-106c95b85777" containerID="7e570ede014f3b62e7289481867c7b647495eebf2f846f5e59e0597dd23ce004" exitCode=143
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.854510 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-glgdj_must-gather-f94nt_240269b5-7b03-4e43-8d40-106c95b85777/copy/0.log"
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.855100 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-glgdj/must-gather-f94nt"
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.980080 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240269b5-7b03-4e43-8d40-106c95b85777-must-gather-output\") pod \"240269b5-7b03-4e43-8d40-106c95b85777\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") "
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.980135 4791 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4x67\" (UniqueName: \"kubernetes.io/projected/240269b5-7b03-4e43-8d40-106c95b85777-kube-api-access-p4x67\") pod \"240269b5-7b03-4e43-8d40-106c95b85777\" (UID: \"240269b5-7b03-4e43-8d40-106c95b85777\") "
Feb 17 00:34:27 crc kubenswrapper[4791]: I0217 00:34:27.996233 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/240269b5-7b03-4e43-8d40-106c95b85777-kube-api-access-p4x67" (OuterVolumeSpecName: "kube-api-access-p4x67") pod "240269b5-7b03-4e43-8d40-106c95b85777" (UID: "240269b5-7b03-4e43-8d40-106c95b85777"). InnerVolumeSpecName "kube-api-access-p4x67". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 00:34:28 crc kubenswrapper[4791]: I0217 00:34:28.033800 4791 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/240269b5-7b03-4e43-8d40-106c95b85777-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "240269b5-7b03-4e43-8d40-106c95b85777" (UID: "240269b5-7b03-4e43-8d40-106c95b85777"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 00:34:28 crc kubenswrapper[4791]: I0217 00:34:28.082067 4791 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240269b5-7b03-4e43-8d40-106c95b85777-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 17 00:34:28 crc kubenswrapper[4791]: I0217 00:34:28.082458 4791 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4x67\" (UniqueName: \"kubernetes.io/projected/240269b5-7b03-4e43-8d40-106c95b85777-kube-api-access-p4x67\") on node \"crc\" DevicePath \"\""
Feb 17 00:34:28 crc kubenswrapper[4791]: I0217 00:34:28.630424 4791 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-glgdj_must-gather-f94nt_240269b5-7b03-4e43-8d40-106c95b85777/copy/0.log"
Feb 17 00:34:28 crc kubenswrapper[4791]: I0217 00:34:28.630831 4791 scope.go:117] "RemoveContainer" containerID="7e570ede014f3b62e7289481867c7b647495eebf2f846f5e59e0597dd23ce004"
Feb 17 00:34:28 crc kubenswrapper[4791]: I0217 00:34:28.630923 4791 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-glgdj/must-gather-f94nt"
Feb 17 00:34:28 crc kubenswrapper[4791]: I0217 00:34:28.654234 4791 scope.go:117] "RemoveContainer" containerID="4aacef8b7dbadb146ac970bd0266869527a6bdc9bd8084d697396cb1fdecc57d"
Feb 17 00:34:29 crc kubenswrapper[4791]: I0217 00:34:29.228911 4791 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="240269b5-7b03-4e43-8d40-106c95b85777" path="/var/lib/kubelet/pods/240269b5-7b03-4e43-8d40-106c95b85777/volumes"
Feb 17 00:34:31 crc kubenswrapper[4791]: I0217 00:34:31.221636 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:34:31 crc kubenswrapper[4791]: E0217 00:34:31.222551 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:34:45 crc kubenswrapper[4791]: I0217 00:34:45.221205 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:34:45 crc kubenswrapper[4791]: E0217 00:34:45.222329 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:35:00 crc kubenswrapper[4791]: I0217 00:35:00.220534 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:35:00 crc kubenswrapper[4791]: E0217 00:35:00.221099 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:35:14 crc kubenswrapper[4791]: I0217 00:35:14.220840 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:35:14 crc kubenswrapper[4791]: E0217 00:35:14.222255 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:35:28 crc kubenswrapper[4791]: I0217 00:35:28.221591 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:35:28 crc kubenswrapper[4791]: E0217 00:35:28.222655 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:35:40 crc kubenswrapper[4791]: I0217 00:35:40.220094 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:35:40 crc kubenswrapper[4791]: E0217 00:35:40.221003 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:35:54 crc kubenswrapper[4791]: I0217 00:35:54.220960 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:35:54 crc kubenswrapper[4791]: E0217 00:35:54.222035 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"
Feb 17 00:36:05 crc kubenswrapper[4791]: I0217 00:36:05.220478 4791 scope.go:117] "RemoveContainer" containerID="83a45c14446dd0994ae78aba5b94b4303c07e73628e6156002bb2f0eb1460104"
Feb 17 00:36:05 crc kubenswrapper[4791]: E0217 00:36:05.221436 4791 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9klkw_openshift-machine-config-operator(02a3a228-86d6-4d54-ad63-0d36c9d59af5)\"" pod="openshift-machine-config-operator/machine-config-daemon-9klkw" podUID="02a3a228-86d6-4d54-ad63-0d36c9d59af5"